DeepDream

This is a creative project using Google's pre-trained DeepDream model. It visualises the patterns a neural network has learned and, in doing so, produces interesting results from input images.

DeepDream over-enhances the patterns in an image using gradient ascent (instead of descent); the images produced are almost dream-like.

Layers of the network can be given extra importance, so the result changes depending on which layers you choose. I will explore this, including different convnets and image types (e.g. natural and not).
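As a quick aside on what "gradient ascent" means here, this is a toy sketch (a made-up one-variable function, not the DeepDream code itself): the update adds the gradient to the input instead of subtracting it, so the quantity being optimised goes up.

```python
import tensorflow as tf

# Toy example: maximise f(x) = -(x - 3)^2 by gradient ascent.
# The maximum is at x = 3, so repeated ascent steps should move x towards 3.
x = tf.Variable(0.0)
for _ in range(200):
    with tf.GradientTape() as tape:
        y = -(x - 3.0) ** 2
    grad = tape.gradient(y, x)
    x.assign_add(0.1 * grad)  # ascent: add the gradient (descent would subtract)

print(round(float(x), 2))  # converges close to 3.0
```

In DeepDream the same idea is applied with the image pixels as the variables and the layer activations as the function being maximised.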

Code

The code and some of the explanations are taken from https://www.tensorflow.org/tutorials/generative/deepdream#optional_scaling_up_with_tiles, which contains a minimal implementation of DeepDream written by Alexander Mordvintsev. This will be used as the base, so that afterwards the model can be changed and images generated with different exaggerated layers. As this is more of a creative project, I will focus on the final images and on which settings show the most interesting results. Some exploration will be kept here, and a PDF alongside this project will contain all of the generated images, labelled with the layers and convnets used to generate them.

This site was also used to help with understanding some of the code: https://www.mlq.ai/deep-dream-with-tensorflow-2-0/.

In [ ]:
import tensorflow as tf
import numpy as np
import matplotlib as mpl
import IPython.display as display
import PIL.Image
from tensorflow.keras.preprocessing import image
import time

Next, the image to be changed is downloaded (the first image, a parrot, is a test image used just to decide which version of the TensorFlow.org code mentioned above to use). The image itself was taken from https://pixabay.com/, which provides royalty-free stock images. The images I will use are on my GitHub and are accessed here:

In [ ]:
url = 'https://github.com/SimasCes/DeepDream/blob/main/parrot.jpg?raw=true'
In [ ]:
# Download an image and read it into a NumPy array.
def download(url, max_dim=None):
  name = url.split('/')[-1]
  image_path = tf.keras.utils.get_file(name, origin=url)
  img = PIL.Image.open(image_path)
  if max_dim:
    img.thumbnail((max_dim, max_dim))
  return np.array(img)

# Normalize an image
def deprocess(img):
  img = 255*(img + 1.0)/2.0
  return tf.cast(img, tf.uint8)

# Display an image
def show(img):
  display.display(PIL.Image.fromarray(np.array(img)))


# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)
show(original_img)

This is where we download a pre-trained classification model; it is also where different convnets can be swapped in.

In [ ]:
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')
In [ ]:
# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

The loss is the sum of the mean activations of the chosen layers (the names variable above). In gradient ascent you want to maximise this value.

In [ ]:
def calc_loss(img, model):
  # Pass forward the image through the model to retrieve the activations.
  # Converts the image into a batch of size 1.
  img_batch = tf.expand_dims(img, axis=0)
  layer_activations = model(img_batch)
  if len(layer_activations) == 1:
    layer_activations = [layer_activations]

  losses = []
  for act in layer_activations:
    loss = tf.math.reduce_mean(act)
    losses.append(loss)

  return  tf.reduce_sum(losses)
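To see what this loss computes, here is a small hand-made example (toy tensors standing in for real layer activations): the loss is simply the sum over layers of each layer's mean activation.

```python
import tensorflow as tf

# Two fake "layer activations": the loss is mean(act1) + mean(act2).
act1 = tf.constant([[1.0, 3.0]])   # mean = 2.0
act2 = tf.constant([[4.0, 8.0]])   # mean = 6.0

losses = [tf.math.reduce_mean(a) for a in (act1, act2)]
loss = tf.reduce_sum(losses)
print(float(loss))  # 2.0 + 6.0 = 8.0
```

Because each layer contributes its mean rather than its raw sum, a large layer and a small layer count roughly equally towards the total.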

Now the gradients are calculated and added to the image.

In [ ]:
class DeepDream(tf.Module):
  def __init__(self, model):
    self.model = model

  @tf.function(
      input_signature=(
        tf.TensorSpec(shape=[None,None,3], dtype=tf.float32),
        tf.TensorSpec(shape=[], dtype=tf.int32),
        tf.TensorSpec(shape=[], dtype=tf.float32),)
  )
  def __call__(self, img, steps, step_size):
      print("Tracing")
      loss = tf.constant(0.0)
      for n in tf.range(steps):
        with tf.GradientTape() as tape:
          # This needs gradients relative to `img`
          # `GradientTape` only watches `tf.Variable`s by default
          tape.watch(img)
          loss = calc_loss(img, self.model)

        # Calculate the gradient of the loss with respect to the pixels of the input image.
        gradients = tape.gradient(loss, img)

        # Normalize the gradients.
        gradients /= tf.math.reduce_std(gradients) + 1e-8 

        # In gradient ascent, the "loss" is maximized so that the input image increasingly "excites" the layers.
        # You can update the image by directly adding the gradients (because they're the same shape!)
        img = img + gradients*step_size
        img = tf.clip_by_value(img, -1, 1)

      return loss, img
In [ ]:
deepdream = DeepDream(dream_model)

Simple

The main loop below runs the DeepDream algorithm.

In [ ]:
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))


  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result
In [ ]:
dream_img = run_deep_dream_simple(img=original_img, 
                                  steps=100, step_size=0.01)

Octave

You can add octaves. Processing the image at several scales stops the output being noisy and low resolution, and lets patterns appear at different granularities: you run the previous approach, then increase the size of the image (an octave), and repeat.
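To make the scaling concrete, this quick check (plain Python, mirroring the loop below) shows the size factors produced by OCTAVE_SCALE = 1.30 over n in range(-2, 3):

```python
OCTAVE_SCALE = 1.30

# For each octave n, the base shape is multiplied by OCTAVE_SCALE**n,
# so the image is dreamed at roughly 59%, 77%, 100%, 130% and 169% of its size.
factors = [round(OCTAVE_SCALE ** n, 2) for n in range(-2, 3)]
print(factors)  # [0.59, 0.77, 1.0, 1.3, 1.69]
```

The dream therefore starts on a downscaled copy (coarse patterns) and finishes on an upscaled one (fine detail).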

In [ ]:
import time
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
236.40856385231018

The run below is the same, but the step size has been increased to give the image more intensity.

In [ ]:
import time
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=10, step_size=0.05)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
49.014517068862915

Tiles

Lastly, you can scale up with tiles. This is aimed at larger images: the image is split into tiles and a gradient is computed for each tile (more information on the approach can be found at https://www.tensorflow.org/tutorials/generative/deepdream#optional_scaling_up_with_tiles, where the DeepDream code was found).
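As a rough illustration of the tiling idea (a NumPy sketch with made-up sizes, not the class below), the image is walked over in fixed-size tiles, with the last partial tile along each axis skipped unless the axis fits in a single tile:

```python
import numpy as np

def tile_starts(length, tile_size):
    # Start offsets of full tiles along one axis; the final partial tile
    # is dropped, unless that would leave no tiles at all.
    starts = list(range(0, length, tile_size))[:-1]
    return starts or [0]

img = np.zeros((1200, 900, 3), dtype=np.float32)
tile = 512
coords = [(x, y) for x in tile_starts(img.shape[0], tile)
                 for y in tile_starts(img.shape[1], tile)]
print(coords)  # [(0, 0), (512, 0)] -- two full 512-wide tiles fit vertically
```

The random roll in the real code then shifts the image each step so these fixed tile boundaries do not leave visible seams.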

In [ ]:
def random_roll(img, maxroll):
  # Randomly shift the image to avoid tiled boundaries.
  shift = tf.random.uniform(shape=[2], minval=-maxroll, maxval=maxroll, dtype=tf.int32)
  img_rolled = tf.roll(img, shift=shift, axis=[0,1])
  return shift, img_rolled
In [ ]:
shift, img_rolled = random_roll(np.array(original_img), 512)
show(img_rolled)
In [ ]:
class TiledGradients(tf.Module):
  def __init__(self, model):
    self.model = model

  @tf.function(
      input_signature=(
        tf.TensorSpec(shape=[None,None,3], dtype=tf.float32),
        tf.TensorSpec(shape=[], dtype=tf.int32),)
  )
  def __call__(self, img, tile_size=512):
    shift, img_rolled = random_roll(img, tile_size)

    # Initialize the image gradients to zero.
    gradients = tf.zeros_like(img_rolled)

    # Skip the last tile, unless there's only one tile.
    xs = tf.range(0, img_rolled.shape[0], tile_size)[:-1]
    if not tf.cast(len(xs), bool):
      xs = tf.constant([0])
    ys = tf.range(0, img_rolled.shape[1], tile_size)[:-1]
    if not tf.cast(len(ys), bool):
      ys = tf.constant([0])

    for x in xs:
      for y in ys:
        # Calculate the gradients for this tile.
        with tf.GradientTape() as tape:
          # This needs gradients relative to `img_rolled`.
          # `GradientTape` only watches `tf.Variable`s by default.
          tape.watch(img_rolled)

          # Extract a tile out of the image.
          img_tile = img_rolled[x:x+tile_size, y:y+tile_size]
          loss = calc_loss(img_tile, self.model)

        # Update the image gradients for this tile.
        gradients = gradients + tape.gradient(loss, img_rolled)

    # Undo the random shift applied to the image and its gradients.
    gradients = tf.roll(gradients, shift=-shift, axis=[0,1])

    # Normalize the gradients.
    gradients /= tf.math.reduce_std(gradients) + 1e-8 

    return gradients
In [ ]:
get_tiled_gradients = TiledGradients(dream_model)
In [ ]:
def run_deep_dream_with_octaves(img, steps_per_octave=100, step_size=0.01, 
                                octaves=range(-2,3), octave_scale=1.3):
  base_shape = tf.shape(img)
  img = tf.keras.preprocessing.image.img_to_array(img)
  img = tf.keras.applications.inception_v3.preprocess_input(img)

  initial_shape = img.shape[:-1]
  img = tf.image.resize(img, initial_shape)
  for octave in octaves:
    # Scale the image based on the octave
    new_size = tf.cast(tf.convert_to_tensor(base_shape[:-1]), tf.float32)*(octave_scale**octave)
    img = tf.image.resize(img, tf.cast(new_size, tf.int32))

    for step in range(steps_per_octave):
      gradients = get_tiled_gradients(img)
      img = img + gradients*step_size
      img = tf.clip_by_value(img, -1, 1)

      if step % 10 == 0:
        display.clear_output(wait=True)
        show(deprocess(img))
        print ("Octave {}, Step {}".format(octave, step))

  result = deprocess(img)
  return result
In [ ]:
img = run_deep_dream_with_octaves(img=original_img, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

First exploration

Overall, the last technique (scaling up with tiles) is very similar to the octaves technique. The octave result has rougher edges and more grain, while the tiles technique has a smoother effect.

The tiles technique, however, takes a lot longer to run. It is also aimed at large images, and mine are not overly large; they will also be made smaller so they display well in the notebook. This is why I will use the octaves technique to experiment, and might run the best images through the tiles technique at the end (since, this being a creative project, I expect to have a few interesting images from the experiments by then).

Images

Having chosen a technique, I will now choose some images to run the algorithm and experiments on. I have chosen three: one natural, one urban, and one artistic. Because of the straight lines in the urban image and the distinctive technique in the artistic one, I hypothesise these images will look different from each other at the end.

Some explorations will be kept in the notebook; however, there will be an extra PDF of all of the outcomes and the settings used to produce them. It will act almost as a sketchbook of the results and will show all of the experiments, making it easier to see the experimentation and results without the code (which stays in this notebook).

These are the 3 images I chose (the variable name is the same for each, as they will be loaded one at a time for the model anyway; this just makes the code easier to run):

In [ ]:
original_img = download('https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true', max_dim=500)
show(original_img)
Downloading data from https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true
827392/823871 [==============================] - 0s 0us/step

Image gotten from https://pixabay.com/

In [ ]:
original_img = download('https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true', max_dim=500)
show(original_img)
Downloading data from https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true
802816/802148 [==============================] - 0s 0us/step

Image gotten from https://pixabay.com/

In [ ]:
original_img = download('https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true', max_dim=500)
show(original_img)
Downloading data from https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true
253952/248817 [==============================] - 0s 0us/step

STARTING

Even though I have already started, this is the actual exploration with the 3 images. This is where my input comes in and where I change the variables and parameters of the convnets themselves instead of just running the provided code. It starts with the basic setup: convnet = InceptionV3, layers = (mixed3, mixed5), octave scale = 1.30, step size = 0.01 (except for 1 trial), steps = 50.

These will be explored systematically to make each of the images interesting. I will test the same steps for all the images but will only keep a few results, so the notebook does not become extremely long. Instead, as mentioned before, a PDF will hold all of the images and their parameters.

I will now also copy some of the code from before. This may look redundant, but there is a good reason: I will have to run the code many times, so to avoid running many different cells the code has been concatenated together, making it a lot easier and quicker to run.

base_model.summary() will also be used, so the layers of the model can be seen and chosen for personalisation/adjustment.
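Besides reading the full summary, a shorter way to see the candidate layers is to filter them by name. This is a sketch (it builds the model with weights=None just to avoid the ImageNet download; the layer names are the same either way):

```python
import tensorflow as tf

# weights=None skips the weight download; layer names are unchanged.
model = tf.keras.applications.InceptionV3(include_top=False, weights=None)

# The 'mixed' layers are the Concatenate blocks the tutorial picks from.
mixed_names = [layer.name for layer in model.layers
               if layer.name.startswith('mixed')]
print(mixed_names[:3])  # ['mixed0', 'mixed1', 'mixed2']
```

The same filter with 'conv2d' instead of 'mixed' lists the plain convolution layers I experiment with later.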

Code that defines functions so it can stay the same (you only have to run the imports from before):

In [ ]:
# Download an image and read it into a NumPy array.
def download(url, max_dim=None):
  name = url.split('/')[-1]
  image_path = tf.keras.utils.get_file(name, origin=url)
  img = PIL.Image.open(image_path)
  if max_dim:
    img.thumbnail((max_dim, max_dim))
  return np.array(img)

# Normalize an image
def deprocess(img):
  img = 255*(img + 1.0)/2.0
  return tf.cast(img, tf.uint8)

# Display an image
def show(img):
  display.display(PIL.Image.fromarray(np.array(img)))

def calc_loss(img, model):
  # Pass forward the image through the model to retrieve the activations.
  # Converts the image into a batch of size 1.
  img_batch = tf.expand_dims(img, axis=0)
  layer_activations = model(img_batch)
  if len(layer_activations) == 1:
    layer_activations = [layer_activations]

  losses = []
  for act in layer_activations:
    loss = tf.math.reduce_mean(act)
    losses.append(loss)

  return  tf.reduce_sum(losses)


class DeepDream(tf.Module):
  def __init__(self, model):
    self.model = model

  @tf.function(
      input_signature=(
        tf.TensorSpec(shape=[None,None,3], dtype=tf.float32),
        tf.TensorSpec(shape=[], dtype=tf.int32),
        tf.TensorSpec(shape=[], dtype=tf.float32),)
  )
  def __call__(self, img, steps, step_size):
      print("Tracing")
      loss = tf.constant(0.0)
      for n in tf.range(steps):
        with tf.GradientTape() as tape:
          # This needs gradients relative to `img`
          # `GradientTape` only watches `tf.Variable`s by default
          tape.watch(img)
          loss = calc_loss(img, self.model)

        # Calculate the gradient of the loss with respect to the pixels of the input image.
        gradients = tape.gradient(loss, img)

        # Normalize the gradients.
        gradients /= tf.math.reduce_std(gradients) + 1e-8 

        # In gradient ascent, the "loss" is maximized so that the input image increasingly "excites" the layers.
        # You can update the image by directly adding the gradients (because they're the same shape!)
        img = img + gradients*step_size
        img = tf.clip_by_value(img, -1, 1)

      return loss, img

Covnets/Models

There is a list of available models to choose from at https://keras.io/api/applications/. The TensorFlow article only uses the concatenation ('mixed') layers, but I have also tried using Conv2D layers alongside or instead of them, as this should give more interesting results and may give my project a different approach/result from others.
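Swapping the convnet means rebuilding the feature-extraction model with a different base and different layer names. This sketch uses VGG16 purely as an example choice (the block names come from VGG16's own summary; weights=None keeps the sketch light, whereas real dreaming needs weights='imagenet'):

```python
import tensorflow as tf

# Example: VGG16 instead of InceptionV3.
base_model = tf.keras.applications.VGG16(include_top=False, weights=None)

# VGG16 has no 'mixed' layers; pick convolution layers by name instead.
names = ['block4_conv3', 'block5_conv3']
layers = [base_model.get_layer(name).output for name in names]
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)
print(len(dream_model.outputs))  # 2
```

Note that the preprocessing would have to change too: run_deep_dream_simple calls inception_v3.preprocess_input (which maps pixels to [-1, 1]), so a VGG16 run would need vgg16.preprocess_input and a matching deprocess step.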

InceptionV3

In [ ]:
# The function needs to be here as you have to change the convnet here too
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')
base_model.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/inception_v3/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5
87916544/87910968 [==============================] - 0s 0us/step
Model: "inception_v3"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, None, None,  0                                            
__________________________________________________________________________________________________
conv2d (Conv2D)                 (None, None, None, 3 864         input_1[0][0]                    
__________________________________________________________________________________________________
batch_normalization (BatchNorma (None, None, None, 3 96          conv2d[0][0]                     
__________________________________________________________________________________________________
activation (Activation)         (None, None, None, 3 0           batch_normalization[0][0]        
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, None, None, 3 9216        activation[0][0]                 
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, None, None, 3 96          conv2d_1[0][0]                   
__________________________________________________________________________________________________
activation_1 (Activation)       (None, None, None, 3 0           batch_normalization_1[0][0]      
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, None, None, 6 18432       activation_1[0][0]               
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, None, None, 6 192         conv2d_2[0][0]                   
__________________________________________________________________________________________________
activation_2 (Activation)       (None, None, None, 6 0           batch_normalization_2[0][0]      
__________________________________________________________________________________________________
max_pooling2d (MaxPooling2D)    (None, None, None, 6 0           activation_2[0][0]               
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, None, None, 8 5120        max_pooling2d[0][0]              
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, None, None, 8 240         conv2d_3[0][0]                   
__________________________________________________________________________________________________
activation_3 (Activation)       (None, None, None, 8 0           batch_normalization_3[0][0]      
__________________________________________________________________________________________________
conv2d_4 (Conv2D)               (None, None, None, 1 138240      activation_3[0][0]               
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, None, None, 1 576         conv2d_4[0][0]                   
__________________________________________________________________________________________________
activation_4 (Activation)       (None, None, None, 1 0           batch_normalization_4[0][0]      
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)  (None, None, None, 1 0           activation_4[0][0]               
__________________________________________________________________________________________________
conv2d_8 (Conv2D)               (None, None, None, 6 12288       max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, None, None, 6 192         conv2d_8[0][0]                   
__________________________________________________________________________________________________
activation_8 (Activation)       (None, None, None, 6 0           batch_normalization_8[0][0]      
__________________________________________________________________________________________________
conv2d_6 (Conv2D)               (None, None, None, 4 9216        max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
conv2d_9 (Conv2D)               (None, None, None, 9 55296       activation_8[0][0]               
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, None, None, 4 144         conv2d_6[0][0]                   
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, None, None, 9 288         conv2d_9[0][0]                   
__________________________________________________________________________________________________
activation_6 (Activation)       (None, None, None, 4 0           batch_normalization_6[0][0]      
__________________________________________________________________________________________________
activation_9 (Activation)       (None, None, None, 9 0           batch_normalization_9[0][0]      
__________________________________________________________________________________________________
average_pooling2d (AveragePooli (None, None, None, 1 0           max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
conv2d_5 (Conv2D)               (None, None, None, 6 12288       max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
conv2d_7 (Conv2D)               (None, None, None, 6 76800       activation_6[0][0]               
__________________________________________________________________________________________________
conv2d_10 (Conv2D)              (None, None, None, 9 82944       activation_9[0][0]               
__________________________________________________________________________________________________
conv2d_11 (Conv2D)              (None, None, None, 3 6144        average_pooling2d[0][0]          
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, None, None, 6 192         conv2d_5[0][0]                   
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, None, None, 6 192         conv2d_7[0][0]                   
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, None, None, 9 288         conv2d_10[0][0]                  
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, None, None, 3 96          conv2d_11[0][0]                  
__________________________________________________________________________________________________
activation_5 (Activation)       (None, None, None, 6 0           batch_normalization_5[0][0]      
__________________________________________________________________________________________________
activation_7 (Activation)       (None, None, None, 6 0           batch_normalization_7[0][0]      
__________________________________________________________________________________________________
activation_10 (Activation)      (None, None, None, 9 0           batch_normalization_10[0][0]     
__________________________________________________________________________________________________
activation_11 (Activation)      (None, None, None, 3 0           batch_normalization_11[0][0]     
__________________________________________________________________________________________________
mixed0 (Concatenate)            (None, None, None, 2 0           activation_5[0][0]               
                                                                 activation_7[0][0]               
                                                                 activation_10[0][0]              
                                                                 activation_11[0][0]              
__________________________________________________________________________________________________
conv2d_15 (Conv2D)              (None, None, None, 6 16384       mixed0[0][0]                     
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, None, None, 6 192         conv2d_15[0][0]                  
__________________________________________________________________________________________________
activation_15 (Activation)      (None, None, None, 6 0           batch_normalization_15[0][0]     
__________________________________________________________________________________________________
conv2d_13 (Conv2D)              (None, None, None, 4 12288       mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_16 (Conv2D)              (None, None, None, 9 55296       activation_15[0][0]              
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, None, None, 4 144         conv2d_13[0][0]                  
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, None, None, 9 288         conv2d_16[0][0]                  
__________________________________________________________________________________________________
activation_13 (Activation)      (None, None, None, 4 0           batch_normalization_13[0][0]     
__________________________________________________________________________________________________
activation_16 (Activation)      (None, None, None, 9 0           batch_normalization_16[0][0]     
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, None, None, 2 0           mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_12 (Conv2D)              (None, None, None, 6 16384       mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_14 (Conv2D)              (None, None, None, 6 76800       activation_13[0][0]              
__________________________________________________________________________________________________
conv2d_17 (Conv2D)              (None, None, None, 9 82944       activation_16[0][0]              
__________________________________________________________________________________________________
conv2d_18 (Conv2D)              (None, None, None, 6 16384       average_pooling2d_1[0][0]        
... (InceptionV3 summary continues: Inception blocks mixed1 through mixed7, each built from parallel Conv2D → BatchNormalization → Activation branches plus an average/max pooling branch, merged by a Concatenate layer) ...
                                                                 activation_68[0][0]              
                                                                 activation_69[0][0]              
__________________________________________________________________________________________________
conv2d_72 (Conv2D)              (None, None, None, 1 147456      mixed7[0][0]                     
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, None, None, 1 576         conv2d_72[0][0]                  
__________________________________________________________________________________________________
activation_72 (Activation)      (None, None, None, 1 0           batch_normalization_72[0][0]     
__________________________________________________________________________________________________
conv2d_73 (Conv2D)              (None, None, None, 1 258048      activation_72[0][0]              
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, None, None, 1 576         conv2d_73[0][0]                  
__________________________________________________________________________________________________
activation_73 (Activation)      (None, None, None, 1 0           batch_normalization_73[0][0]     
__________________________________________________________________________________________________
conv2d_70 (Conv2D)              (None, None, None, 1 147456      mixed7[0][0]                     
__________________________________________________________________________________________________
conv2d_74 (Conv2D)              (None, None, None, 1 258048      activation_73[0][0]              
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, None, None, 1 576         conv2d_70[0][0]                  
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, None, None, 1 576         conv2d_74[0][0]                  
__________________________________________________________________________________________________
activation_70 (Activation)      (None, None, None, 1 0           batch_normalization_70[0][0]     
__________________________________________________________________________________________________
activation_74 (Activation)      (None, None, None, 1 0           batch_normalization_74[0][0]     
__________________________________________________________________________________________________
conv2d_71 (Conv2D)              (None, None, None, 3 552960      activation_70[0][0]              
__________________________________________________________________________________________________
conv2d_75 (Conv2D)              (None, None, None, 1 331776      activation_74[0][0]              
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, None, None, 3 960         conv2d_71[0][0]                  
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, None, None, 1 576         conv2d_75[0][0]                  
__________________________________________________________________________________________________
activation_71 (Activation)      (None, None, None, 3 0           batch_normalization_71[0][0]     
__________________________________________________________________________________________________
activation_75 (Activation)      (None, None, None, 1 0           batch_normalization_75[0][0]     
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D)  (None, None, None, 7 0           mixed7[0][0]                     
__________________________________________________________________________________________________
mixed8 (Concatenate)            (None, None, None, 1 0           activation_71[0][0]              
                                                                 activation_75[0][0]              
                                                                 max_pooling2d_3[0][0]            
__________________________________________________________________________________________________
conv2d_80 (Conv2D)              (None, None, None, 4 573440      mixed8[0][0]                     
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, None, None, 4 1344        conv2d_80[0][0]                  
__________________________________________________________________________________________________
activation_80 (Activation)      (None, None, None, 4 0           batch_normalization_80[0][0]     
__________________________________________________________________________________________________
conv2d_77 (Conv2D)              (None, None, None, 3 491520      mixed8[0][0]                     
__________________________________________________________________________________________________
conv2d_81 (Conv2D)              (None, None, None, 3 1548288     activation_80[0][0]              
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, None, None, 3 1152        conv2d_77[0][0]                  
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, None, None, 3 1152        conv2d_81[0][0]                  
__________________________________________________________________________________________________
activation_77 (Activation)      (None, None, None, 3 0           batch_normalization_77[0][0]     
__________________________________________________________________________________________________
activation_81 (Activation)      (None, None, None, 3 0           batch_normalization_81[0][0]     
__________________________________________________________________________________________________
conv2d_78 (Conv2D)              (None, None, None, 3 442368      activation_77[0][0]              
__________________________________________________________________________________________________
conv2d_79 (Conv2D)              (None, None, None, 3 442368      activation_77[0][0]              
__________________________________________________________________________________________________
conv2d_82 (Conv2D)              (None, None, None, 3 442368      activation_81[0][0]              
__________________________________________________________________________________________________
conv2d_83 (Conv2D)              (None, None, None, 3 442368      activation_81[0][0]              
__________________________________________________________________________________________________
average_pooling2d_7 (AveragePoo (None, None, None, 1 0           mixed8[0][0]                     
__________________________________________________________________________________________________
conv2d_76 (Conv2D)              (None, None, None, 3 409600      mixed8[0][0]                     
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, None, None, 3 1152        conv2d_78[0][0]                  
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, None, None, 3 1152        conv2d_79[0][0]                  
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, None, None, 3 1152        conv2d_82[0][0]                  
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, None, None, 3 1152        conv2d_83[0][0]                  
__________________________________________________________________________________________________
conv2d_84 (Conv2D)              (None, None, None, 1 245760      average_pooling2d_7[0][0]        
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, None, None, 3 960         conv2d_76[0][0]                  
__________________________________________________________________________________________________
activation_78 (Activation)      (None, None, None, 3 0           batch_normalization_78[0][0]     
__________________________________________________________________________________________________
activation_79 (Activation)      (None, None, None, 3 0           batch_normalization_79[0][0]     
__________________________________________________________________________________________________
activation_82 (Activation)      (None, None, None, 3 0           batch_normalization_82[0][0]     
__________________________________________________________________________________________________
activation_83 (Activation)      (None, None, None, 3 0           batch_normalization_83[0][0]     
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, None, None, 1 576         conv2d_84[0][0]                  
__________________________________________________________________________________________________
activation_76 (Activation)      (None, None, None, 3 0           batch_normalization_76[0][0]     
__________________________________________________________________________________________________
mixed9_0 (Concatenate)          (None, None, None, 7 0           activation_78[0][0]              
                                                                 activation_79[0][0]              
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, None, None, 7 0           activation_82[0][0]              
                                                                 activation_83[0][0]              
__________________________________________________________________________________________________
activation_84 (Activation)      (None, None, None, 1 0           batch_normalization_84[0][0]     
__________________________________________________________________________________________________
mixed9 (Concatenate)            (None, None, None, 2 0           activation_76[0][0]              
                                                                 mixed9_0[0][0]                   
                                                                 concatenate[0][0]                
                                                                 activation_84[0][0]              
__________________________________________________________________________________________________
conv2d_89 (Conv2D)              (None, None, None, 4 917504      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, None, None, 4 1344        conv2d_89[0][0]                  
__________________________________________________________________________________________________
activation_89 (Activation)      (None, None, None, 4 0           batch_normalization_89[0][0]     
__________________________________________________________________________________________________
conv2d_86 (Conv2D)              (None, None, None, 3 786432      mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_90 (Conv2D)              (None, None, None, 3 1548288     activation_89[0][0]              
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, None, None, 3 1152        conv2d_86[0][0]                  
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, None, None, 3 1152        conv2d_90[0][0]                  
__________________________________________________________________________________________________
activation_86 (Activation)      (None, None, None, 3 0           batch_normalization_86[0][0]     
__________________________________________________________________________________________________
activation_90 (Activation)      (None, None, None, 3 0           batch_normalization_90[0][0]     
__________________________________________________________________________________________________
conv2d_87 (Conv2D)              (None, None, None, 3 442368      activation_86[0][0]              
__________________________________________________________________________________________________
conv2d_88 (Conv2D)              (None, None, None, 3 442368      activation_86[0][0]              
__________________________________________________________________________________________________
conv2d_91 (Conv2D)              (None, None, None, 3 442368      activation_90[0][0]              
__________________________________________________________________________________________________
conv2d_92 (Conv2D)              (None, None, None, 3 442368      activation_90[0][0]              
__________________________________________________________________________________________________
average_pooling2d_8 (AveragePoo (None, None, None, 2 0           mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_85 (Conv2D)              (None, None, None, 3 655360      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, None, None, 3 1152        conv2d_87[0][0]                  
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, None, None, 3 1152        conv2d_88[0][0]                  
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, None, None, 3 1152        conv2d_91[0][0]                  
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, None, None, 3 1152        conv2d_92[0][0]                  
__________________________________________________________________________________________________
conv2d_93 (Conv2D)              (None, None, None, 1 393216      average_pooling2d_8[0][0]        
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, None, None, 3 960         conv2d_85[0][0]                  
__________________________________________________________________________________________________
activation_87 (Activation)      (None, None, None, 3 0           batch_normalization_87[0][0]     
__________________________________________________________________________________________________
activation_88 (Activation)      (None, None, None, 3 0           batch_normalization_88[0][0]     
__________________________________________________________________________________________________
activation_91 (Activation)      (None, None, None, 3 0           batch_normalization_91[0][0]     
__________________________________________________________________________________________________
activation_92 (Activation)      (None, None, None, 3 0           batch_normalization_92[0][0]     
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, None, None, 1 576         conv2d_93[0][0]                  
__________________________________________________________________________________________________
activation_85 (Activation)      (None, None, None, 3 0           batch_normalization_85[0][0]     
__________________________________________________________________________________________________
mixed9_1 (Concatenate)          (None, None, None, 7 0           activation_87[0][0]              
                                                                 activation_88[0][0]              
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, None, None, 7 0           activation_91[0][0]              
                                                                 activation_92[0][0]              
__________________________________________________________________________________________________
activation_93 (Activation)      (None, None, None, 1 0           batch_normalization_93[0][0]     
__________________________________________________________________________________________________
mixed10 (Concatenate)           (None, None, None, 2 0           activation_85[0][0]              
                                                                 mixed9_1[0][0]                   
                                                                 concatenate_1[0][0]              
                                                                 activation_93[0][0]              
==================================================================================================
Total params: 21,802,784
Trainable params: 21,768,352
Non-trainable params: 34,432
__________________________________________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
271.23355174064636
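The octave loop above rescales the image by `OCTAVE_SCALE**n` for `n` from -2 to 2 before each dreaming pass, so patterns are enhanced at several spatial scales rather than just one. As a quick sanity check, the per-octave working sizes for a hypothetical 500-pixel base dimension would be:

```python
# Per-octave working sizes for a hypothetical 500-px base dimension.
OCTAVE_SCALE = 1.30
base = 500
sizes = [int(base * OCTAVE_SCALE ** n) for n in range(-2, 3)]
print(sizes)  # → [295, 384, 500, 650, 845]
```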
In [ ]:
# The function needs to be redefined here, as the convnet it uses is changed here too
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then print a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')
base_model.summary()
Model: "inception_v3"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_2 (InputLayer)            [(None, None, None,  0                                            
__________________________________________________________________________________________________
conv2d_94 (Conv2D)              (None, None, None, 3 864         input_2[0][0]                    
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, None, None, 3 96          conv2d_94[0][0]                  
__________________________________________________________________________________________________
activation_94 (Activation)      (None, None, None, 3 0           batch_normalization_94[0][0]     
__________________________________________________________________________________________________
conv2d_95 (Conv2D)              (None, None, None, 3 9216        activation_94[0][0]              
__________________________________________________________________________________________________
batch_normalization_95 (BatchNo (None, None, None, 3 96          conv2d_95[0][0]                  
__________________________________________________________________________________________________
activation_95 (Activation)      (None, None, None, 3 0           batch_normalization_95[0][0]     
__________________________________________________________________________________________________
conv2d_96 (Conv2D)              (None, None, None, 6 18432       activation_95[0][0]              
__________________________________________________________________________________________________
batch_normalization_96 (BatchNo (None, None, None, 6 192         conv2d_96[0][0]                  
__________________________________________________________________________________________________
activation_96 (Activation)      (None, None, None, 6 0           batch_normalization_96[0][0]     
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D)  (None, None, None, 6 0           activation_96[0][0]              
__________________________________________________________________________________________________
conv2d_97 (Conv2D)              (None, None, None, 8 5120        max_pooling2d_4[0][0]            
__________________________________________________________________________________________________
batch_normalization_97 (BatchNo (None, None, None, 8 240         conv2d_97[0][0]                  
__________________________________________________________________________________________________
activation_97 (Activation)      (None, None, None, 8 0           batch_normalization_97[0][0]     
__________________________________________________________________________________________________
conv2d_98 (Conv2D)              (None, None, None, 1 138240      activation_97[0][0]              
__________________________________________________________________________________________________
batch_normalization_98 (BatchNo (None, None, None, 1 576         conv2d_98[0][0]                  
__________________________________________________________________________________________________
activation_98 (Activation)      (None, None, None, 1 0           batch_normalization_98[0][0]     
__________________________________________________________________________________________________
max_pooling2d_5 (MaxPooling2D)  (None, None, None, 1 0           activation_98[0][0]              
__________________________________________________________________________________________________
conv2d_102 (Conv2D)             (None, None, None, 6 12288       max_pooling2d_5[0][0]            
__________________________________________________________________________________________________
batch_normalization_102 (BatchN (None, None, None, 6 192         conv2d_102[0][0]                 
__________________________________________________________________________________________________
activation_102 (Activation)     (None, None, None, 6 0           batch_normalization_102[0][0]    
__________________________________________________________________________________________________
conv2d_100 (Conv2D)             (None, None, None, 4 9216        max_pooling2d_5[0][0]            
__________________________________________________________________________________________________
conv2d_103 (Conv2D)             (None, None, None, 9 55296       activation_102[0][0]             
__________________________________________________________________________________________________
batch_normalization_100 (BatchN (None, None, None, 4 144         conv2d_100[0][0]                 
__________________________________________________________________________________________________
batch_normalization_103 (BatchN (None, None, None, 9 288         conv2d_103[0][0]                 
__________________________________________________________________________________________________
activation_100 (Activation)     (None, None, None, 4 0           batch_normalization_100[0][0]    
__________________________________________________________________________________________________
activation_103 (Activation)     (None, None, None, 9 0           batch_normalization_103[0][0]    
__________________________________________________________________________________________________
average_pooling2d_9 (AveragePoo (None, None, None, 1 0           max_pooling2d_5[0][0]            
__________________________________________________________________________________________________
conv2d_99 (Conv2D)              (None, None, None, 6 12288       max_pooling2d_5[0][0]            
__________________________________________________________________________________________________
conv2d_101 (Conv2D)             (None, None, None, 6 76800       activation_100[0][0]             
__________________________________________________________________________________________________
conv2d_104 (Conv2D)             (None, None, None, 9 82944       activation_103[0][0]             
__________________________________________________________________________________________________
conv2d_105 (Conv2D)             (None, None, None, 3 6144        average_pooling2d_9[0][0]        
__________________________________________________________________________________________________
batch_normalization_99 (BatchNo (None, None, None, 6 192         conv2d_99[0][0]                  
__________________________________________________________________________________________________
batch_normalization_101 (BatchN (None, None, None, 6 192         conv2d_101[0][0]                 
__________________________________________________________________________________________________
batch_normalization_104 (BatchN (None, None, None, 9 288         conv2d_104[0][0]                 
__________________________________________________________________________________________________
batch_normalization_105 (BatchN (None, None, None, 3 96          conv2d_105[0][0]                 
__________________________________________________________________________________________________
activation_99 (Activation)      (None, None, None, 6 0           batch_normalization_99[0][0]     
__________________________________________________________________________________________________
activation_101 (Activation)     (None, None, None, 6 0           batch_normalization_101[0][0]    
__________________________________________________________________________________________________
activation_104 (Activation)     (None, None, None, 9 0           batch_normalization_104[0][0]    
__________________________________________________________________________________________________
... [InceptionV3 `model.summary()` output continues through the mixed0–mixed6 inception blocks: each block runs four parallel branches of Conv2D → BatchNormalization → Activation layers whose outputs are joined by a `mixedN` Concatenate layer; Keras' fixed summary column width truncates the channel dimension of each output shape in this printout] ...
__________________________________________________________________________________________________
__________________________________________________________________________________________________
batch_normalization_161 (BatchN (None, None, None, 1 576         conv2d_161[0][0]                 
__________________________________________________________________________________________________
activation_156 (Activation)     (None, None, None, 1 0           batch_normalization_156[0][0]    
__________________________________________________________________________________________________
activation_161 (Activation)     (None, None, None, 1 0           batch_normalization_161[0][0]    
__________________________________________________________________________________________________
average_pooling2d_15 (AveragePo (None, None, None, 7 0           mixed6[0][0]                     
__________________________________________________________________________________________________
conv2d_154 (Conv2D)             (None, None, None, 1 147456      mixed6[0][0]                     
__________________________________________________________________________________________________
conv2d_157 (Conv2D)             (None, None, None, 1 258048      activation_156[0][0]             
__________________________________________________________________________________________________
conv2d_162 (Conv2D)             (None, None, None, 1 258048      activation_161[0][0]             
__________________________________________________________________________________________________
conv2d_163 (Conv2D)             (None, None, None, 1 147456      average_pooling2d_15[0][0]       
__________________________________________________________________________________________________
batch_normalization_154 (BatchN (None, None, None, 1 576         conv2d_154[0][0]                 
__________________________________________________________________________________________________
batch_normalization_157 (BatchN (None, None, None, 1 576         conv2d_157[0][0]                 
__________________________________________________________________________________________________
batch_normalization_162 (BatchN (None, None, None, 1 576         conv2d_162[0][0]                 
__________________________________________________________________________________________________
batch_normalization_163 (BatchN (None, None, None, 1 576         conv2d_163[0][0]                 
__________________________________________________________________________________________________
activation_154 (Activation)     (None, None, None, 1 0           batch_normalization_154[0][0]    
__________________________________________________________________________________________________
activation_157 (Activation)     (None, None, None, 1 0           batch_normalization_157[0][0]    
__________________________________________________________________________________________________
activation_162 (Activation)     (None, None, None, 1 0           batch_normalization_162[0][0]    
__________________________________________________________________________________________________
activation_163 (Activation)     (None, None, None, 1 0           batch_normalization_163[0][0]    
__________________________________________________________________________________________________
mixed7 (Concatenate)            (None, None, None, 7 0           activation_154[0][0]             
                                                                 activation_157[0][0]             
                                                                 activation_162[0][0]             
                                                                 activation_163[0][0]             
__________________________________________________________________________________________________
conv2d_166 (Conv2D)             (None, None, None, 1 147456      mixed7[0][0]                     
__________________________________________________________________________________________________
batch_normalization_166 (BatchN (None, None, None, 1 576         conv2d_166[0][0]                 
__________________________________________________________________________________________________
activation_166 (Activation)     (None, None, None, 1 0           batch_normalization_166[0][0]    
__________________________________________________________________________________________________
conv2d_167 (Conv2D)             (None, None, None, 1 258048      activation_166[0][0]             
__________________________________________________________________________________________________
batch_normalization_167 (BatchN (None, None, None, 1 576         conv2d_167[0][0]                 
__________________________________________________________________________________________________
activation_167 (Activation)     (None, None, None, 1 0           batch_normalization_167[0][0]    
__________________________________________________________________________________________________
conv2d_164 (Conv2D)             (None, None, None, 1 147456      mixed7[0][0]                     
__________________________________________________________________________________________________
conv2d_168 (Conv2D)             (None, None, None, 1 258048      activation_167[0][0]             
__________________________________________________________________________________________________
batch_normalization_164 (BatchN (None, None, None, 1 576         conv2d_164[0][0]                 
__________________________________________________________________________________________________
batch_normalization_168 (BatchN (None, None, None, 1 576         conv2d_168[0][0]                 
__________________________________________________________________________________________________
activation_164 (Activation)     (None, None, None, 1 0           batch_normalization_164[0][0]    
__________________________________________________________________________________________________
activation_168 (Activation)     (None, None, None, 1 0           batch_normalization_168[0][0]    
__________________________________________________________________________________________________
conv2d_165 (Conv2D)             (None, None, None, 3 552960      activation_164[0][0]             
__________________________________________________________________________________________________
conv2d_169 (Conv2D)             (None, None, None, 1 331776      activation_168[0][0]             
__________________________________________________________________________________________________
batch_normalization_165 (BatchN (None, None, None, 3 960         conv2d_165[0][0]                 
__________________________________________________________________________________________________
batch_normalization_169 (BatchN (None, None, None, 1 576         conv2d_169[0][0]                 
__________________________________________________________________________________________________
activation_165 (Activation)     (None, None, None, 3 0           batch_normalization_165[0][0]    
__________________________________________________________________________________________________
activation_169 (Activation)     (None, None, None, 1 0           batch_normalization_169[0][0]    
__________________________________________________________________________________________________
max_pooling2d_7 (MaxPooling2D)  (None, None, None, 7 0           mixed7[0][0]                     
__________________________________________________________________________________________________
mixed8 (Concatenate)            (None, None, None, 1 0           activation_165[0][0]             
                                                                 activation_169[0][0]             
                                                                 max_pooling2d_7[0][0]            
__________________________________________________________________________________________________
conv2d_174 (Conv2D)             (None, None, None, 4 573440      mixed8[0][0]                     
__________________________________________________________________________________________________
batch_normalization_174 (BatchN (None, None, None, 4 1344        conv2d_174[0][0]                 
__________________________________________________________________________________________________
activation_174 (Activation)     (None, None, None, 4 0           batch_normalization_174[0][0]    
__________________________________________________________________________________________________
conv2d_171 (Conv2D)             (None, None, None, 3 491520      mixed8[0][0]                     
__________________________________________________________________________________________________
conv2d_175 (Conv2D)             (None, None, None, 3 1548288     activation_174[0][0]             
__________________________________________________________________________________________________
batch_normalization_171 (BatchN (None, None, None, 3 1152        conv2d_171[0][0]                 
__________________________________________________________________________________________________
batch_normalization_175 (BatchN (None, None, None, 3 1152        conv2d_175[0][0]                 
__________________________________________________________________________________________________
activation_171 (Activation)     (None, None, None, 3 0           batch_normalization_171[0][0]    
__________________________________________________________________________________________________
activation_175 (Activation)     (None, None, None, 3 0           batch_normalization_175[0][0]    
__________________________________________________________________________________________________
conv2d_172 (Conv2D)             (None, None, None, 3 442368      activation_171[0][0]             
__________________________________________________________________________________________________
conv2d_173 (Conv2D)             (None, None, None, 3 442368      activation_171[0][0]             
__________________________________________________________________________________________________
conv2d_176 (Conv2D)             (None, None, None, 3 442368      activation_175[0][0]             
__________________________________________________________________________________________________
conv2d_177 (Conv2D)             (None, None, None, 3 442368      activation_175[0][0]             
__________________________________________________________________________________________________
average_pooling2d_16 (AveragePo (None, None, None, 1 0           mixed8[0][0]                     
__________________________________________________________________________________________________
conv2d_170 (Conv2D)             (None, None, None, 3 409600      mixed8[0][0]                     
__________________________________________________________________________________________________
batch_normalization_172 (BatchN (None, None, None, 3 1152        conv2d_172[0][0]                 
__________________________________________________________________________________________________
batch_normalization_173 (BatchN (None, None, None, 3 1152        conv2d_173[0][0]                 
__________________________________________________________________________________________________
batch_normalization_176 (BatchN (None, None, None, 3 1152        conv2d_176[0][0]                 
__________________________________________________________________________________________________
batch_normalization_177 (BatchN (None, None, None, 3 1152        conv2d_177[0][0]                 
__________________________________________________________________________________________________
conv2d_178 (Conv2D)             (None, None, None, 1 245760      average_pooling2d_16[0][0]       
__________________________________________________________________________________________________
batch_normalization_170 (BatchN (None, None, None, 3 960         conv2d_170[0][0]                 
__________________________________________________________________________________________________
activation_172 (Activation)     (None, None, None, 3 0           batch_normalization_172[0][0]    
__________________________________________________________________________________________________
activation_173 (Activation)     (None, None, None, 3 0           batch_normalization_173[0][0]    
__________________________________________________________________________________________________
activation_176 (Activation)     (None, None, None, 3 0           batch_normalization_176[0][0]    
__________________________________________________________________________________________________
activation_177 (Activation)     (None, None, None, 3 0           batch_normalization_177[0][0]    
__________________________________________________________________________________________________
batch_normalization_178 (BatchN (None, None, None, 1 576         conv2d_178[0][0]                 
__________________________________________________________________________________________________
activation_170 (Activation)     (None, None, None, 3 0           batch_normalization_170[0][0]    
__________________________________________________________________________________________________
mixed9_0 (Concatenate)          (None, None, None, 7 0           activation_172[0][0]             
                                                                 activation_173[0][0]             
__________________________________________________________________________________________________
concatenate_2 (Concatenate)     (None, None, None, 7 0           activation_176[0][0]             
                                                                 activation_177[0][0]             
__________________________________________________________________________________________________
activation_178 (Activation)     (None, None, None, 1 0           batch_normalization_178[0][0]    
__________________________________________________________________________________________________
mixed9 (Concatenate)            (None, None, None, 2 0           activation_170[0][0]             
                                                                 mixed9_0[0][0]                   
                                                                 concatenate_2[0][0]              
                                                                 activation_178[0][0]             
__________________________________________________________________________________________________
conv2d_183 (Conv2D)             (None, None, None, 4 917504      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_183 (BatchN (None, None, None, 4 1344        conv2d_183[0][0]                 
__________________________________________________________________________________________________
activation_183 (Activation)     (None, None, None, 4 0           batch_normalization_183[0][0]    
__________________________________________________________________________________________________
conv2d_180 (Conv2D)             (None, None, None, 3 786432      mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_184 (Conv2D)             (None, None, None, 3 1548288     activation_183[0][0]             
__________________________________________________________________________________________________
batch_normalization_180 (BatchN (None, None, None, 3 1152        conv2d_180[0][0]                 
__________________________________________________________________________________________________
batch_normalization_184 (BatchN (None, None, None, 3 1152        conv2d_184[0][0]                 
__________________________________________________________________________________________________
activation_180 (Activation)     (None, None, None, 3 0           batch_normalization_180[0][0]    
__________________________________________________________________________________________________
activation_184 (Activation)     (None, None, None, 3 0           batch_normalization_184[0][0]    
__________________________________________________________________________________________________
conv2d_181 (Conv2D)             (None, None, None, 3 442368      activation_180[0][0]             
__________________________________________________________________________________________________
conv2d_182 (Conv2D)             (None, None, None, 3 442368      activation_180[0][0]             
__________________________________________________________________________________________________
conv2d_185 (Conv2D)             (None, None, None, 3 442368      activation_184[0][0]             
__________________________________________________________________________________________________
conv2d_186 (Conv2D)             (None, None, None, 3 442368      activation_184[0][0]             
__________________________________________________________________________________________________
average_pooling2d_17 (AveragePo (None, None, None, 2 0           mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_179 (Conv2D)             (None, None, None, 3 655360      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_181 (BatchN (None, None, None, 3 1152        conv2d_181[0][0]                 
__________________________________________________________________________________________________
batch_normalization_182 (BatchN (None, None, None, 3 1152        conv2d_182[0][0]                 
__________________________________________________________________________________________________
batch_normalization_185 (BatchN (None, None, None, 3 1152        conv2d_185[0][0]                 
__________________________________________________________________________________________________
batch_normalization_186 (BatchN (None, None, None, 3 1152        conv2d_186[0][0]                 
__________________________________________________________________________________________________
conv2d_187 (Conv2D)             (None, None, None, 1 393216      average_pooling2d_17[0][0]       
__________________________________________________________________________________________________
batch_normalization_179 (BatchN (None, None, None, 3 960         conv2d_179[0][0]                 
__________________________________________________________________________________________________
activation_181 (Activation)     (None, None, None, 3 0           batch_normalization_181[0][0]    
__________________________________________________________________________________________________
activation_182 (Activation)     (None, None, None, 3 0           batch_normalization_182[0][0]    
__________________________________________________________________________________________________
activation_185 (Activation)     (None, None, None, 3 0           batch_normalization_185[0][0]    
__________________________________________________________________________________________________
activation_186 (Activation)     (None, None, None, 3 0           batch_normalization_186[0][0]    
__________________________________________________________________________________________________
batch_normalization_187 (BatchN (None, None, None, 1 576         conv2d_187[0][0]                 
__________________________________________________________________________________________________
activation_179 (Activation)     (None, None, None, 3 0           batch_normalization_179[0][0]    
__________________________________________________________________________________________________
mixed9_1 (Concatenate)          (None, None, None, 7 0           activation_181[0][0]             
                                                                 activation_182[0][0]             
__________________________________________________________________________________________________
concatenate_3 (Concatenate)     (None, None, None, 7 0           activation_185[0][0]             
                                                                 activation_186[0][0]             
__________________________________________________________________________________________________
activation_187 (Activation)     (None, None, None, 1 0           batch_normalization_187[0][0]    
__________________________________________________________________________________________________
mixed10 (Concatenate)           (None, None, None, 2 0           activation_179[0][0]             
                                                                 mixed9_1[0][0]                   
                                                                 concatenate_3[0][0]              
                                                                 activation_187[0][0]             
==================================================================================================
Total params: 21,802,784
Trainable params: 21,768,352
Non-trainable params: 34,432
__________________________________________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
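For reference, the loss that the DeepDream step maximises (following the TensorFlow tutorial this notebook is based on) is the sum of the mean activation of each chosen layer. A minimal NumPy sketch of that reduction, using made-up arrays in place of the real `mixed3` / `mixed5` tensors:

```python
import numpy as np

def calc_loss_sketch(layer_activations):
    # Reduce each layer's activations to their mean, then sum across layers.
    # Gradient *ascent* on this scalar pushes the chosen layers to fire harder.
    return sum(float(np.mean(act)) for act in layer_activations)

# Two fake "layer outputs" standing in for the mixed3 / mixed5 tensors.
fake_mixed3 = np.ones((1, 4, 4, 8))        # mean = 1.0
fake_mixed5 = np.full((1, 2, 2, 16), 3.0)  # mean = 3.0
print(calc_loss_sketch([fake_mixed3, fake_mixed5]))  # 4.0
```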
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end - start  # elapsed time in seconds
Out[ ]:
226.20457458496094
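For intuition, the octave loop above processes the image at five geometrically spaced sizes around the base resolution. With a 500-pixel side (the `max_dim` used when downloading) and `OCTAVE_SCALE = 1.30` as in the cell above, the sizes work out as:

```python
OCTAVE_SCALE = 1.30
base_side = 500  # max_dim used when downloading the image

# tf.cast(..., tf.int32) truncates toward zero, so int() matches it here.
sizes = [int(base_side * OCTAVE_SCALE ** n) for n in range(-2, 3)]
print(sizes)  # [295, 384, 500, 650, 845]
```

So the dream runs twice below the original resolution and twice above it, which is what lets patterns appear at several spatial scales.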

I think these results are really interesting and have a very 'biological' look, which suits the Venus flytrap image a lot.

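One detail worth noting before moving on: `run_deep_dream_simple` does not execute all of the requested steps in one call. Its while-loop runs the traced `deepdream` step in chunks of at most 100 steps, so the intermediate image can be displayed between chunks. The chunking itself is plain Python and can be sketched on its own:

```python
def step_chunks(steps, max_run=100):
    # Split a total step count into runs of at most max_run steps,
    # mirroring the while-loop inside run_deep_dream_simple.
    chunks = []
    remaining = steps
    while remaining:
        run = max_run if remaining > max_run else remaining
        remaining -= run
        chunks.append(run)
    return chunks

print(step_chunks(250))  # [100, 100, 50]
print(step_chunks(50))   # [50]
```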
Xception

In [ ]:
# The function needs to be redefined here because the preprocessing has to match the new convnet
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.xception.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model, then print a summary table
base_model = tf.keras.applications.Xception(include_top=False, weights='imagenet')
base_model.summary()
Model: "xception"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_8 (InputLayer)            [(None, None, None,  0                                            
__________________________________________________________________________________________________
block1_conv1 (Conv2D)           (None, None, None, 3 864         input_8[0][0]                    
__________________________________________________________________________________________________
block1_conv1_bn (BatchNormaliza (None, None, None, 3 128         block1_conv1[0][0]               
__________________________________________________________________________________________________
block1_conv1_act (Activation)   (None, None, None, 3 0           block1_conv1_bn[0][0]            
__________________________________________________________________________________________________
block1_conv2 (Conv2D)           (None, None, None, 6 18432       block1_conv1_act[0][0]           
__________________________________________________________________________________________________
block1_conv2_bn (BatchNormaliza (None, None, None, 6 256         block1_conv2[0][0]               
__________________________________________________________________________________________________
block1_conv2_act (Activation)   (None, None, None, 6 0           block1_conv2_bn[0][0]            
__________________________________________________________________________________________________
block2_sepconv1 (SeparableConv2 (None, None, None, 1 8768        block1_conv2_act[0][0]           
__________________________________________________________________________________________________
block2_sepconv1_bn (BatchNormal (None, None, None, 1 512         block2_sepconv1[0][0]            
__________________________________________________________________________________________________
block2_sepconv2_act (Activation (None, None, None, 1 0           block2_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block2_sepconv2 (SeparableConv2 (None, None, None, 1 17536       block2_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block2_sepconv2_bn (BatchNormal (None, None, None, 1 512         block2_sepconv2[0][0]            
__________________________________________________________________________________________________
conv2d_298 (Conv2D)             (None, None, None, 1 8192        block1_conv2_act[0][0]           
__________________________________________________________________________________________________
block2_pool (MaxPooling2D)      (None, None, None, 1 0           block2_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
batch_normalization_298 (BatchN (None, None, None, 1 512         conv2d_298[0][0]                 
__________________________________________________________________________________________________
add_48 (Add)                    (None, None, None, 1 0           block2_pool[0][0]                
                                                                 batch_normalization_298[0][0]    
__________________________________________________________________________________________________
block3_sepconv1_act (Activation (None, None, None, 1 0           add_48[0][0]                     
__________________________________________________________________________________________________
block3_sepconv1 (SeparableConv2 (None, None, None, 2 33920       block3_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block3_sepconv1_bn (BatchNormal (None, None, None, 2 1024        block3_sepconv1[0][0]            
__________________________________________________________________________________________________
block3_sepconv2_act (Activation (None, None, None, 2 0           block3_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block3_sepconv2 (SeparableConv2 (None, None, None, 2 67840       block3_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block3_sepconv2_bn (BatchNormal (None, None, None, 2 1024        block3_sepconv2[0][0]            
__________________________________________________________________________________________________
conv2d_299 (Conv2D)             (None, None, None, 2 32768       add_48[0][0]                     
__________________________________________________________________________________________________
block3_pool (MaxPooling2D)      (None, None, None, 2 0           block3_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
batch_normalization_299 (BatchN (None, None, None, 2 1024        conv2d_299[0][0]                 
__________________________________________________________________________________________________
add_49 (Add)                    (None, None, None, 2 0           block3_pool[0][0]                
                                                                 batch_normalization_299[0][0]    
__________________________________________________________________________________________________
block4_sepconv1_act (Activation (None, None, None, 2 0           add_49[0][0]                     
__________________________________________________________________________________________________
block4_sepconv1 (SeparableConv2 (None, None, None, 7 188672      block4_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block4_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block4_sepconv1[0][0]            
__________________________________________________________________________________________________
block4_sepconv2_act (Activation (None, None, None, 7 0           block4_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block4_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block4_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block4_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block4_sepconv2[0][0]            
__________________________________________________________________________________________________
conv2d_300 (Conv2D)             (None, None, None, 7 186368      add_49[0][0]                     
__________________________________________________________________________________________________
block4_pool (MaxPooling2D)      (None, None, None, 7 0           block4_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
batch_normalization_300 (BatchN (None, None, None, 7 2912        conv2d_300[0][0]                 
__________________________________________________________________________________________________
add_50 (Add)                    (None, None, None, 7 0           block4_pool[0][0]                
                                                                 batch_normalization_300[0][0]    
__________________________________________________________________________________________________
block5_sepconv1_act (Activation (None, None, None, 7 0           add_50[0][0]                     
__________________________________________________________________________________________________
block5_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block5_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block5_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block5_sepconv1[0][0]            
__________________________________________________________________________________________________
block5_sepconv2_act (Activation (None, None, None, 7 0           block5_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block5_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block5_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block5_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block5_sepconv2[0][0]            
__________________________________________________________________________________________________
block5_sepconv3_act (Activation (None, None, None, 7 0           block5_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block5_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block5_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block5_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block5_sepconv3[0][0]            
__________________________________________________________________________________________________
add_51 (Add)                    (None, None, None, 7 0           block5_sepconv3_bn[0][0]         
                                                                 add_50[0][0]                     
__________________________________________________________________________________________________
block6_sepconv1_act (Activation (None, None, None, 7 0           add_51[0][0]                     
__________________________________________________________________________________________________
block6_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block6_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block6_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block6_sepconv1[0][0]            
__________________________________________________________________________________________________
block6_sepconv2_act (Activation (None, None, None, 7 0           block6_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block6_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block6_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block6_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block6_sepconv2[0][0]            
__________________________________________________________________________________________________
block6_sepconv3_act (Activation (None, None, None, 7 0           block6_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block6_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block6_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block6_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block6_sepconv3[0][0]            
__________________________________________________________________________________________________
add_52 (Add)                    (None, None, None, 7 0           block6_sepconv3_bn[0][0]         
                                                                 add_51[0][0]                     
__________________________________________________________________________________________________
block7_sepconv1_act (Activation (None, None, None, 7 0           add_52[0][0]                     
__________________________________________________________________________________________________
block7_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block7_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block7_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block7_sepconv1[0][0]            
__________________________________________________________________________________________________
block7_sepconv2_act (Activation (None, None, None, 7 0           block7_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block7_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block7_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block7_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block7_sepconv2[0][0]            
__________________________________________________________________________________________________
block7_sepconv3_act (Activation (None, None, None, 7 0           block7_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block7_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block7_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block7_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block7_sepconv3[0][0]            
__________________________________________________________________________________________________
add_53 (Add)                    (None, None, None, 7 0           block7_sepconv3_bn[0][0]         
                                                                 add_52[0][0]                     
__________________________________________________________________________________________________
block8_sepconv1_act (Activation (None, None, None, 7 0           add_53[0][0]                     
__________________________________________________________________________________________________
block8_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block8_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block8_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block8_sepconv1[0][0]            
__________________________________________________________________________________________________
block8_sepconv2_act (Activation (None, None, None, 7 0           block8_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block8_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block8_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block8_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block8_sepconv2[0][0]            
__________________________________________________________________________________________________
block8_sepconv3_act (Activation (None, None, None, 7 0           block8_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block8_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block8_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block8_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block8_sepconv3[0][0]            
__________________________________________________________________________________________________
add_54 (Add)                    (None, None, None, 7 0           block8_sepconv3_bn[0][0]         
                                                                 add_53[0][0]                     
__________________________________________________________________________________________________
block9_sepconv1_act (Activation (None, None, None, 7 0           add_54[0][0]                     
__________________________________________________________________________________________________
block9_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block9_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block9_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block9_sepconv1[0][0]            
__________________________________________________________________________________________________
block9_sepconv2_act (Activation (None, None, None, 7 0           block9_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block9_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block9_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block9_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block9_sepconv2[0][0]            
__________________________________________________________________________________________________
block9_sepconv3_act (Activation (None, None, None, 7 0           block9_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block9_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block9_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block9_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block9_sepconv3[0][0]            
__________________________________________________________________________________________________
add_55 (Add)                    (None, None, None, 7 0           block9_sepconv3_bn[0][0]         
                                                                 add_54[0][0]                     
__________________________________________________________________________________________________
block10_sepconv1_act (Activatio (None, None, None, 7 0           add_55[0][0]                     
__________________________________________________________________________________________________
block10_sepconv1 (SeparableConv (None, None, None, 7 536536      block10_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block10_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block10_sepconv1[0][0]           
__________________________________________________________________________________________________
block10_sepconv2_act (Activatio (None, None, None, 7 0           block10_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block10_sepconv2 (SeparableConv (None, None, None, 7 536536      block10_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block10_sepconv2_bn (BatchNorma (None, None, None, 7 2912        block10_sepconv2[0][0]           
__________________________________________________________________________________________________
block10_sepconv3_act (Activatio (None, None, None, 7 0           block10_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
block10_sepconv3 (SeparableConv (None, None, None, 7 536536      block10_sepconv3_act[0][0]       
__________________________________________________________________________________________________
block10_sepconv3_bn (BatchNorma (None, None, None, 7 2912        block10_sepconv3[0][0]           
__________________________________________________________________________________________________
add_56 (Add)                    (None, None, None, 7 0           block10_sepconv3_bn[0][0]        
                                                                 add_55[0][0]                     
__________________________________________________________________________________________________
block11_sepconv1_act (Activatio (None, None, None, 7 0           add_56[0][0]                     
__________________________________________________________________________________________________
block11_sepconv1 (SeparableConv (None, None, None, 7 536536      block11_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block11_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block11_sepconv1[0][0]           
__________________________________________________________________________________________________
block11_sepconv2_act (Activatio (None, None, None, 7 0           block11_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block11_sepconv2 (SeparableConv (None, None, None, 7 536536      block11_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block11_sepconv2_bn (BatchNorma (None, None, None, 7 2912        block11_sepconv2[0][0]           
__________________________________________________________________________________________________
block11_sepconv3_act (Activatio (None, None, None, 7 0           block11_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
block11_sepconv3 (SeparableConv (None, None, None, 7 536536      block11_sepconv3_act[0][0]       
__________________________________________________________________________________________________
block11_sepconv3_bn (BatchNorma (None, None, None, 7 2912        block11_sepconv3[0][0]           
__________________________________________________________________________________________________
add_57 (Add)                    (None, None, None, 7 0           block11_sepconv3_bn[0][0]        
                                                                 add_56[0][0]                     
__________________________________________________________________________________________________
block12_sepconv1_act (Activatio (None, None, None, 7 0           add_57[0][0]                     
__________________________________________________________________________________________________
block12_sepconv1 (SeparableConv (None, None, None, 7 536536      block12_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block12_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block12_sepconv1[0][0]           
__________________________________________________________________________________________________
block12_sepconv2_act (Activatio (None, None, None, 7 0           block12_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block12_sepconv2 (SeparableConv (None, None, None, 7 536536      block12_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block12_sepconv2_bn (BatchNorma (None, None, None, 7 2912        block12_sepconv2[0][0]           
__________________________________________________________________________________________________
block12_sepconv3_act (Activatio (None, None, None, 7 0           block12_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
block12_sepconv3 (SeparableConv (None, None, None, 7 536536      block12_sepconv3_act[0][0]       
__________________________________________________________________________________________________
block12_sepconv3_bn (BatchNorma (None, None, None, 7 2912        block12_sepconv3[0][0]           
__________________________________________________________________________________________________
add_58 (Add)                    (None, None, None, 7 0           block12_sepconv3_bn[0][0]        
                                                                 add_57[0][0]                     
__________________________________________________________________________________________________
block13_sepconv1_act (Activatio (None, None, None, 7 0           add_58[0][0]                     
__________________________________________________________________________________________________
block13_sepconv1 (SeparableConv (None, None, None, 7 536536      block13_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block13_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block13_sepconv1[0][0]           
__________________________________________________________________________________________________
block13_sepconv2_act (Activatio (None, None, None, 7 0           block13_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block13_sepconv2 (SeparableConv (None, None, None, 1 752024      block13_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block13_sepconv2_bn (BatchNorma (None, None, None, 1 4096        block13_sepconv2[0][0]           
__________________________________________________________________________________________________
conv2d_301 (Conv2D)             (None, None, None, 1 745472      add_58[0][0]                     
__________________________________________________________________________________________________
block13_pool (MaxPooling2D)     (None, None, None, 1 0           block13_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
batch_normalization_301 (BatchN (None, None, None, 1 4096        conv2d_301[0][0]                 
__________________________________________________________________________________________________
add_59 (Add)                    (None, None, None, 1 0           block13_pool[0][0]               
                                                                 batch_normalization_301[0][0]    
__________________________________________________________________________________________________
block14_sepconv1 (SeparableConv (None, None, None, 1 1582080     add_59[0][0]                     
__________________________________________________________________________________________________
block14_sepconv1_bn (BatchNorma (None, None, None, 1 6144        block14_sepconv1[0][0]           
__________________________________________________________________________________________________
block14_sepconv1_act (Activatio (None, None, None, 1 0           block14_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block14_sepconv2 (SeparableConv (None, None, None, 2 3159552     block14_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block14_sepconv2_bn (BatchNorma (None, None, None, 2 8192        block14_sepconv2[0][0]           
__________________________________________________________________________________________________
block14_sepconv2_act (Activatio (None, None, None, 2 0           block14_sepconv2_bn[0][0]        
==================================================================================================
Total params: 20,861,480
Trainable params: 20,806,952
Non-trainable params: 54,528
__________________________________________________________________________________________________
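The parameter counts in this table can be sanity-checked by hand. A `SeparableConv2D` layer factorises a convolution into a k×k depthwise kernel per input channel followed by a 1×1 pointwise projection, and Xception's separable convs use no bias term. A small sketch (the function name is my own) reproduces the counts above, e.g. block4_sepconv1 mapping 256 → 728 channels with a 3×3 kernel:

```python
def separable_conv_params(k, c_in, c_out, use_bias=False):
    """Depthwise k*k kernel per input channel, then 1x1 pointwise c_in -> c_out."""
    depthwise = k * k * c_in
    pointwise = c_in * c_out
    bias = c_out if use_bias else 0
    return depthwise + pointwise + bias

print(separable_conv_params(3, 256, 728))  # 188672, matching block4_sepconv1
print(separable_conv_params(3, 728, 728))  # 536536, matching block5_sepconv1
```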
In [ ]:
# Maximize the activations of these layers
names = ['block1_conv1']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
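Changing `names` is how the dreamed patterns change, so it helps to know which layer names a convnet actually exposes. In a live session that list comes from `[layer.name for layer in base_model.layers]`; the sketch below just filters such a list by substring, demonstrated on a hard-coded sample (hypothetical, so it runs standalone):

```python
def matching_layer_names(layer_names, keyword):
    """Return the layer names containing the keyword, in model order."""
    return [name for name in layer_names if keyword in name]

# Hypothetical sample standing in for [layer.name for layer in base_model.layers].
sample = ["input_1", "block1_conv1", "block1_conv1_bn", "block4_sepconv1", "add_49"]
print(matching_layer_names(sample, "sepconv"))  # ['block4_sepconv1']
```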
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
3.146927833557129
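The octave loop above reprocesses the image at five scales, each derived from the base shape rather than cumulatively from the previous octave. Assuming a hypothetical 375×500 base image (after `max_dim=500` downsizing), the octave sizes work out as:

```python
OCTAVE_SCALE = 1.30
base_shape = (375, 500)  # hypothetical (height, width) after max_dim=500

# Each octave scales the *base* shape by OCTAVE_SCALE**n, truncated to int,
# mirroring tf.cast(float_base_shape * OCTAVE_SCALE**n, tf.int32).
octave_shapes = [
    tuple(int(d * OCTAVE_SCALE ** n) for d in base_shape) for n in range(-2, 3)
]
print(octave_shapes)  # [(221, 295), (288, 384), (375, 500), (487, 650), (633, 845)]
```

So the dream is first run at roughly 59% of the base size and last at 169% of it, before the final resize back to `base_shape`.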

To me, Xception, at least at the lower layers, really makes the images look retro, which works well for the city; the painting begins to look as if it is underwater or in space.

MobileNet
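The redefined `run_deep_dream_simple` below drives the dream in chunks of at most 100 gradient-ascent steps so intermediate results can be displayed along the way. The chunking itself is easy to sketch in isolation (the helper name is my own):

```python
def step_chunks(total_steps, chunk_size=100):
    """Split a step budget into runs of at most chunk_size steps,
    as the steps_remaining loop in run_deep_dream_simple does."""
    runs = []
    remaining = total_steps
    while remaining:
        run = chunk_size if remaining > chunk_size else remaining
        runs.append(run)
        remaining -= run
    return runs

print(step_chunks(250))  # [100, 100, 50]
```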

In [ ]:
# The function needs to be redefined here because the preprocessing must match the new convnet
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model, then print a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')
base_model.summary()
WARNING:tensorflow:`input_shape` is undefined or non-square, or `rows` is not in [128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default.
Model: "mobilenet_1.00_224"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_13 (InputLayer)        [(None, None, None, 3)]   0         
_________________________________________________________________
conv1 (Conv2D)               (None, None, None, 32)    864       
_________________________________________________________________
conv1_bn (BatchNormalization (None, None, None, 32)    128       
_________________________________________________________________
conv1_relu (ReLU)            (None, None, None, 32)    0         
_________________________________________________________________
conv_dw_1 (DepthwiseConv2D)  (None, None, None, 32)    288       
_________________________________________________________________
conv_dw_1_bn (BatchNormaliza (None, None, None, 32)    128       
_________________________________________________________________
conv_dw_1_relu (ReLU)        (None, None, None, 32)    0         
_________________________________________________________________
conv_pw_1 (Conv2D)           (None, None, None, 64)    2048      
_________________________________________________________________
conv_pw_1_bn (BatchNormaliza (None, None, None, 64)    256       
_________________________________________________________________
conv_pw_1_relu (ReLU)        (None, None, None, 64)    0         
_________________________________________________________________
conv_pad_2 (ZeroPadding2D)   (None, None, None, 64)    0         
_________________________________________________________________
conv_dw_2 (DepthwiseConv2D)  (None, None, None, 64)    576       
_________________________________________________________________
conv_dw_2_bn (BatchNormaliza (None, None, None, 64)    256       
_________________________________________________________________
conv_dw_2_relu (ReLU)        (None, None, None, 64)    0         
_________________________________________________________________
conv_pw_2 (Conv2D)           (None, None, None, 128)   8192      
_________________________________________________________________
conv_pw_2_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_pw_2_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_dw_3 (DepthwiseConv2D)  (None, None, None, 128)   1152      
_________________________________________________________________
conv_dw_3_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_dw_3_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_pw_3 (Conv2D)           (None, None, None, 128)   16384     
_________________________________________________________________
conv_pw_3_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_pw_3_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_pad_4 (ZeroPadding2D)   (None, None, None, 128)   0         
_________________________________________________________________
conv_dw_4 (DepthwiseConv2D)  (None, None, None, 128)   1152      
_________________________________________________________________
conv_dw_4_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_dw_4_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_pw_4 (Conv2D)           (None, None, None, 256)   32768     
_________________________________________________________________
conv_pw_4_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_pw_4_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_dw_5 (DepthwiseConv2D)  (None, None, None, 256)   2304      
_________________________________________________________________
conv_dw_5_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_dw_5_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_pw_5 (Conv2D)           (None, None, None, 256)   65536     
_________________________________________________________________
conv_pw_5_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_pw_5_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_pad_6 (ZeroPadding2D)   (None, None, None, 256)   0         
_________________________________________________________________
conv_dw_6 (DepthwiseConv2D)  (None, None, None, 256)   2304      
_________________________________________________________________
conv_dw_6_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_dw_6_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_pw_6 (Conv2D)           (None, None, None, 512)   131072    
_________________________________________________________________
conv_pw_6_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_6_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_7 (DepthwiseConv2D)  (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_7_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_7_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_7 (Conv2D)           (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_7_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_7_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_8 (DepthwiseConv2D)  (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_8_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_8_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_8 (Conv2D)           (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_8_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_8_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_9 (DepthwiseConv2D)  (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_9_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_9_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_9 (Conv2D)           (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_9_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_9_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_10 (DepthwiseConv2D) (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_10_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_10_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_10 (Conv2D)          (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_10_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_10_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_11 (DepthwiseConv2D) (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_11_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_11_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_11 (Conv2D)          (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_11_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_11_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pad_12 (ZeroPadding2D)  (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_12 (DepthwiseConv2D) (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_12_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_12_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_12 (Conv2D)          (None, None, None, 1024)  524288    
_________________________________________________________________
conv_pw_12_bn (BatchNormaliz (None, None, None, 1024)  4096      
_________________________________________________________________
conv_pw_12_relu (ReLU)       (None, None, None, 1024)  0         
_________________________________________________________________
conv_dw_13 (DepthwiseConv2D) (None, None, None, 1024)  9216      
_________________________________________________________________
conv_dw_13_bn (BatchNormaliz (None, None, None, 1024)  4096      
_________________________________________________________________
conv_dw_13_relu (ReLU)       (None, None, None, 1024)  0         
_________________________________________________________________
conv_pw_13 (Conv2D)          (None, None, None, 1024)  1048576   
_________________________________________________________________
conv_pw_13_bn (BatchNormaliz (None, None, None, 1024)  4096      
_________________________________________________________________
conv_pw_13_relu (ReLU)       (None, None, None, 1024)  0         
=================================================================
Total params: 3,228,864
Trainable params: 3,206,976
Non-trainable params: 21,888
_________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['conv_pw_9', 'conv_pw_8']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
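As a reminder of what `deepdream` is ascending: following the TensorFlow tutorial, the objective is simply the sum of the mean activations of the selected layers, so pushing the loss up makes those layers "see" more of their patterns in the image. A sketch of that loss:

```python
import tensorflow as tf

# Sketch of the DeepDream objective (per the TensorFlow tutorial):
# the mean activation of each selected layer, summed over layers.
def calc_loss(img, model):
  img_batch = tf.expand_dims(img, axis=0)  # the model expects a batch
  layer_activations = model(img_batch)
  if len(layer_activations) == 1:          # single-output model: wrap in a list
    layer_activations = [layer_activations]
  losses = [tf.math.reduce_mean(act) for act in layer_activations]
  return tf.reduce_sum(losses)
```

Gradient ascent then adds `step_size * gradient` of this loss to the image, instead of subtracting it as ordinary training would.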
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
109.73373031616211
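To make the octave schedule concrete, here is a small standalone sketch (assuming a 500-pixel base dimension, matching the `max_dim=500` used when downloading the images) of the sizes the loop above visits:

```python
# Sketch: the image sizes the octave loop visits for a 500-pixel
# base dimension, i.e. base * OCTAVE_SCALE**n for n in -2..2.
OCTAVE_SCALE = 1.30

sizes = [int(500 * OCTAVE_SCALE ** n) for n in range(-2, 3)]
print(sizes)  # smallest octave first, then progressively larger
```

Running the dream from the smallest octave upwards lets patterns found at coarse scales be refined at progressively finer ones.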

This result is one of my favourites. The outputs remind me of cubism; the algorithm almost transforms the fine art piece from a classical style into a cubist one, which would be a very interesting use for DeepDream.

DenseNet121

In [ ]:
# The function needs to be redefined here because the preprocessing must match the new convnet
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.densenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and print a summary table to see the available layer names
base_model = tf.keras.applications.DenseNet121(include_top=False, weights='imagenet')
base_model.summary()
Model: "densenet121"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_17 (InputLayer)           [(None, None, None,  0                                            
__________________________________________________________________________________________________
zero_padding2d_6 (ZeroPadding2D (None, None, None, 3 0           input_17[0][0]                   
__________________________________________________________________________________________________
conv1/conv (Conv2D)             (None, None, None, 6 9408        zero_padding2d_6[0][0]           
__________________________________________________________________________________________________
conv1/bn (BatchNormalization)   (None, None, None, 6 256         conv1/conv[0][0]                 
__________________________________________________________________________________________________
conv1/relu (Activation)         (None, None, None, 6 0           conv1/bn[0][0]                   
__________________________________________________________________________________________________
zero_padding2d_7 (ZeroPadding2D (None, None, None, 6 0           conv1/relu[0][0]                 
__________________________________________________________________________________________________
pool1 (MaxPooling2D)            (None, None, None, 6 0           zero_padding2d_7[0][0]           
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, None, None, 6 256         pool1[0][0]                      
__________________________________________________________________________________________________
conv2_block1_0_relu (Activation (None, None, None, 6 0           conv2_block1_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D)    (None, None, None, 1 8192        conv2_block1_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, None, None, 1 512         conv2_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, None, None, 1 0           conv2_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D)    (None, None, None, 3 36864       conv2_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_concat (Concatenat (None, None, None, 9 0           pool1[0][0]                      
                                                                 conv2_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_0_bn (BatchNormali (None, None, None, 9 384         conv2_block1_concat[0][0]        
__________________________________________________________________________________________________
conv2_block2_0_relu (Activation (None, None, None, 9 0           conv2_block2_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D)    (None, None, None, 1 12288       conv2_block2_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, None, None, 1 512         conv2_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, None, None, 1 0           conv2_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D)    (None, None, None, 3 36864       conv2_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_concat (Concatenat (None, None, None, 1 0           conv2_block1_concat[0][0]        
                                                                 conv2_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_0_bn (BatchNormali (None, None, None, 1 512         conv2_block2_concat[0][0]        
__________________________________________________________________________________________________
conv2_block3_0_relu (Activation (None, None, None, 1 0           conv2_block3_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D)    (None, None, None, 1 16384       conv2_block3_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, None, None, 1 512         conv2_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, None, None, 1 0           conv2_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D)    (None, None, None, 3 36864       conv2_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_concat (Concatenat (None, None, None, 1 0           conv2_block2_concat[0][0]        
                                                                 conv2_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block4_0_bn (BatchNormali (None, None, None, 1 640         conv2_block3_concat[0][0]        
__________________________________________________________________________________________________
conv2_block4_0_relu (Activation (None, None, None, 1 0           conv2_block4_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block4_1_conv (Conv2D)    (None, None, None, 1 20480       conv2_block4_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block4_1_bn (BatchNormali (None, None, None, 1 512         conv2_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block4_1_relu (Activation (None, None, None, 1 0           conv2_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block4_2_conv (Conv2D)    (None, None, None, 3 36864       conv2_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block4_concat (Concatenat (None, None, None, 1 0           conv2_block3_concat[0][0]        
                                                                 conv2_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block5_0_bn (BatchNormali (None, None, None, 1 768         conv2_block4_concat[0][0]        
__________________________________________________________________________________________________
conv2_block5_0_relu (Activation (None, None, None, 1 0           conv2_block5_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block5_1_conv (Conv2D)    (None, None, None, 1 24576       conv2_block5_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block5_1_bn (BatchNormali (None, None, None, 1 512         conv2_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block5_1_relu (Activation (None, None, None, 1 0           conv2_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block5_2_conv (Conv2D)    (None, None, None, 3 36864       conv2_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block5_concat (Concatenat (None, None, None, 2 0           conv2_block4_concat[0][0]        
                                                                 conv2_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block6_0_bn (BatchNormali (None, None, None, 2 896         conv2_block5_concat[0][0]        
__________________________________________________________________________________________________
conv2_block6_0_relu (Activation (None, None, None, 2 0           conv2_block6_0_bn[0][0]          
__________________________________________________________________________________________________
conv2_block6_1_conv (Conv2D)    (None, None, None, 1 28672       conv2_block6_0_relu[0][0]        
__________________________________________________________________________________________________
conv2_block6_1_bn (BatchNormali (None, None, None, 1 512         conv2_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block6_1_relu (Activation (None, None, None, 1 0           conv2_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block6_2_conv (Conv2D)    (None, None, None, 3 36864       conv2_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block6_concat (Concatenat (None, None, None, 2 0           conv2_block5_concat[0][0]        
                                                                 conv2_block6_2_conv[0][0]        
__________________________________________________________________________________________________
pool2_bn (BatchNormalization)   (None, None, None, 2 1024        conv2_block6_concat[0][0]        
__________________________________________________________________________________________________
pool2_relu (Activation)         (None, None, None, 2 0           pool2_bn[0][0]                   
__________________________________________________________________________________________________
pool2_conv (Conv2D)             (None, None, None, 1 32768       pool2_relu[0][0]                 
__________________________________________________________________________________________________
pool2_pool (AveragePooling2D)   (None, None, None, 1 0           pool2_conv[0][0]                 
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, None, None, 1 512         pool2_pool[0][0]                 
__________________________________________________________________________________________________
conv3_block1_0_relu (Activation (None, None, None, 1 0           conv3_block1_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D)    (None, None, None, 1 16384       conv3_block1_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, None, None, 1 512         conv3_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, None, None, 1 0           conv3_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_concat (Concatenat (None, None, None, 1 0           pool2_pool[0][0]                 
                                                                 conv3_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_0_bn (BatchNormali (None, None, None, 1 640         conv3_block1_concat[0][0]        
__________________________________________________________________________________________________
conv3_block2_0_relu (Activation (None, None, None, 1 0           conv3_block2_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D)    (None, None, None, 1 20480       conv3_block2_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, None, None, 1 512         conv3_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, None, None, 1 0           conv3_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_concat (Concatenat (None, None, None, 1 0           conv3_block1_concat[0][0]        
                                                                 conv3_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_0_bn (BatchNormali (None, None, None, 1 768         conv3_block2_concat[0][0]        
__________________________________________________________________________________________________
conv3_block3_0_relu (Activation (None, None, None, 1 0           conv3_block3_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D)    (None, None, None, 1 24576       conv3_block3_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, None, None, 1 512         conv3_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, None, None, 1 0           conv3_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_concat (Concatenat (None, None, None, 2 0           conv3_block2_concat[0][0]        
                                                                 conv3_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_0_bn (BatchNormali (None, None, None, 2 896         conv3_block3_concat[0][0]        
__________________________________________________________________________________________________
conv3_block4_0_relu (Activation (None, None, None, 2 0           conv3_block4_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D)    (None, None, None, 1 28672       conv3_block4_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, None, None, 1 512         conv3_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, None, None, 1 0           conv3_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_concat (Concatenat (None, None, None, 2 0           conv3_block3_concat[0][0]        
                                                                 conv3_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block5_0_bn (BatchNormali (None, None, None, 2 1024        conv3_block4_concat[0][0]        
__________________________________________________________________________________________________
conv3_block5_0_relu (Activation (None, None, None, 2 0           conv3_block5_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block5_1_conv (Conv2D)    (None, None, None, 1 32768       conv3_block5_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block5_1_bn (BatchNormali (None, None, None, 1 512         conv3_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block5_1_relu (Activation (None, None, None, 1 0           conv3_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block5_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block5_concat (Concatenat (None, None, None, 2 0           conv3_block4_concat[0][0]        
                                                                 conv3_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block6_0_bn (BatchNormali (None, None, None, 2 1152        conv3_block5_concat[0][0]        
__________________________________________________________________________________________________
conv3_block6_0_relu (Activation (None, None, None, 2 0           conv3_block6_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block6_1_conv (Conv2D)    (None, None, None, 1 36864       conv3_block6_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block6_1_bn (BatchNormali (None, None, None, 1 512         conv3_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block6_1_relu (Activation (None, None, None, 1 0           conv3_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block6_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block6_concat (Concatenat (None, None, None, 3 0           conv3_block5_concat[0][0]        
                                                                 conv3_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block7_0_bn (BatchNormali (None, None, None, 3 1280        conv3_block6_concat[0][0]        
__________________________________________________________________________________________________
conv3_block7_0_relu (Activation (None, None, None, 3 0           conv3_block7_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block7_1_conv (Conv2D)    (None, None, None, 1 40960       conv3_block7_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block7_1_bn (BatchNormali (None, None, None, 1 512         conv3_block7_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block7_1_relu (Activation (None, None, None, 1 0           conv3_block7_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block7_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block7_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block7_concat (Concatenat (None, None, None, 3 0           conv3_block6_concat[0][0]        
                                                                 conv3_block7_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block8_0_bn (BatchNormali (None, None, None, 3 1408        conv3_block7_concat[0][0]        
__________________________________________________________________________________________________
conv3_block8_0_relu (Activation (None, None, None, 3 0           conv3_block8_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block8_1_conv (Conv2D)    (None, None, None, 1 45056       conv3_block8_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block8_1_bn (BatchNormali (None, None, None, 1 512         conv3_block8_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block8_1_relu (Activation (None, None, None, 1 0           conv3_block8_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block8_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block8_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block8_concat (Concatenat (None, None, None, 3 0           conv3_block7_concat[0][0]        
                                                                 conv3_block8_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block9_0_bn (BatchNormali (None, None, None, 3 1536        conv3_block8_concat[0][0]        
__________________________________________________________________________________________________
conv3_block9_0_relu (Activation (None, None, None, 3 0           conv3_block9_0_bn[0][0]          
__________________________________________________________________________________________________
conv3_block9_1_conv (Conv2D)    (None, None, None, 1 49152       conv3_block9_0_relu[0][0]        
__________________________________________________________________________________________________
conv3_block9_1_bn (BatchNormali (None, None, None, 1 512         conv3_block9_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block9_1_relu (Activation (None, None, None, 1 0           conv3_block9_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block9_2_conv (Conv2D)    (None, None, None, 3 36864       conv3_block9_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block9_concat (Concatenat (None, None, None, 4 0           conv3_block8_concat[0][0]        
                                                                 conv3_block9_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block10_0_bn (BatchNormal (None, None, None, 4 1664        conv3_block9_concat[0][0]        
__________________________________________________________________________________________________
conv3_block10_0_relu (Activatio (None, None, None, 4 0           conv3_block10_0_bn[0][0]         
__________________________________________________________________________________________________
conv3_block10_1_conv (Conv2D)   (None, None, None, 1 53248       conv3_block10_0_relu[0][0]       
__________________________________________________________________________________________________
conv3_block10_1_bn (BatchNormal (None, None, None, 1 512         conv3_block10_1_conv[0][0]       
__________________________________________________________________________________________________
conv3_block10_1_relu (Activatio (None, None, None, 1 0           conv3_block10_1_bn[0][0]         
__________________________________________________________________________________________________
conv3_block10_2_conv (Conv2D)   (None, None, None, 3 36864       conv3_block10_1_relu[0][0]       
__________________________________________________________________________________________________
conv3_block10_concat (Concatena (None, None, None, 4 0           conv3_block9_concat[0][0]        
                                                                 conv3_block10_2_conv[0][0]       
__________________________________________________________________________________________________
conv3_block11_0_bn (BatchNormal (None, None, None, 4 1792        conv3_block10_concat[0][0]       
__________________________________________________________________________________________________
conv3_block11_0_relu (Activatio (None, None, None, 4 0           conv3_block11_0_bn[0][0]         
__________________________________________________________________________________________________
conv3_block11_1_conv (Conv2D)   (None, None, None, 1 57344       conv3_block11_0_relu[0][0]       
__________________________________________________________________________________________________
conv3_block11_1_bn (BatchNormal (None, None, None, 1 512         conv3_block11_1_conv[0][0]       
__________________________________________________________________________________________________
conv3_block11_1_relu (Activatio (None, None, None, 1 0           conv3_block11_1_bn[0][0]         
__________________________________________________________________________________________________
conv3_block11_2_conv (Conv2D)   (None, None, None, 3 36864       conv3_block11_1_relu[0][0]       
__________________________________________________________________________________________________
conv3_block11_concat (Concatena (None, None, None, 4 0           conv3_block10_concat[0][0]       
                                                                 conv3_block11_2_conv[0][0]       
__________________________________________________________________________________________________
conv3_block12_0_bn (BatchNormal (None, None, None, 4 1920        conv3_block11_concat[0][0]       
__________________________________________________________________________________________________
conv3_block12_0_relu (Activatio (None, None, None, 4 0           conv3_block12_0_bn[0][0]         
__________________________________________________________________________________________________
conv3_block12_1_conv (Conv2D)   (None, None, None, 1 61440       conv3_block12_0_relu[0][0]       
__________________________________________________________________________________________________
conv3_block12_1_bn (BatchNormal (None, None, None, 1 512         conv3_block12_1_conv[0][0]       
__________________________________________________________________________________________________
conv3_block12_1_relu (Activatio (None, None, None, 1 0           conv3_block12_1_bn[0][0]         
__________________________________________________________________________________________________
conv3_block12_2_conv (Conv2D)   (None, None, None, 3 36864       conv3_block12_1_relu[0][0]       
__________________________________________________________________________________________________
conv3_block12_concat (Concatena (None, None, None, 5 0           conv3_block11_concat[0][0]       
                                                                 conv3_block12_2_conv[0][0]       
__________________________________________________________________________________________________
pool3_bn (BatchNormalization)   (None, None, None, 5 2048        conv3_block12_concat[0][0]       
__________________________________________________________________________________________________
pool3_relu (Activation)         (None, None, None, 5 0           pool3_bn[0][0]                   
__________________________________________________________________________________________________
pool3_conv (Conv2D)             (None, None, None, 2 131072      pool3_relu[0][0]                 
__________________________________________________________________________________________________
pool3_pool (AveragePooling2D)   (None, None, None, 2 0           pool3_conv[0][0]                 
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, None, None, 2 1024        pool3_pool[0][0]                 
__________________________________________________________________________________________________
conv4_block1_0_relu (Activation (None, None, None, 2 0           conv4_block1_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D)    (None, None, None, 1 32768       conv4_block1_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, None, None, 1 512         conv4_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, None, None, 1 0           conv4_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_concat (Concatenat (None, None, None, 2 0           pool3_pool[0][0]                 
                                                                 conv4_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_0_bn (BatchNormali (None, None, None, 2 1152        conv4_block1_concat[0][0]        
__________________________________________________________________________________________________
conv4_block2_0_relu (Activation (None, None, None, 2 0           conv4_block2_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D)    (None, None, None, 1 36864       conv4_block2_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, None, None, 1 512         conv4_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, None, None, 1 0           conv4_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_concat (Concatenat (None, None, None, 3 0           conv4_block1_concat[0][0]        
                                                                 conv4_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_0_bn (BatchNormali (None, None, None, 3 1280        conv4_block2_concat[0][0]        
__________________________________________________________________________________________________
conv4_block3_0_relu (Activation (None, None, None, 3 0           conv4_block3_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D)    (None, None, None, 1 40960       conv4_block3_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, None, None, 1 512         conv4_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, None, None, 1 0           conv4_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_concat (Concatenat (None, None, None, 3 0           conv4_block2_concat[0][0]        
                                                                 conv4_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_0_bn (BatchNormali (None, None, None, 3 1408        conv4_block3_concat[0][0]        
__________________________________________________________________________________________________
conv4_block4_0_relu (Activation (None, None, None, 3 0           conv4_block4_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D)    (None, None, None, 1 45056       conv4_block4_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, None, None, 1 512         conv4_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, None, None, 1 0           conv4_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_concat (Concatenat (None, None, None, 3 0           conv4_block3_concat[0][0]        
                                                                 conv4_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_0_bn (BatchNormali (None, None, None, 3 1536        conv4_block4_concat[0][0]        
__________________________________________________________________________________________________
conv4_block5_0_relu (Activation (None, None, None, 3 0           conv4_block5_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D)    (None, None, None, 1 49152       conv4_block5_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, None, None, 1 512         conv4_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, None, None, 1 0           conv4_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_concat (Concatenat (None, None, None, 4 0           conv4_block4_concat[0][0]        
                                                                 conv4_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_0_bn (BatchNormali (None, None, None, 4 1664        conv4_block5_concat[0][0]        
__________________________________________________________________________________________________
conv4_block6_0_relu (Activation (None, None, None, 4 0           conv4_block6_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D)    (None, None, None, 1 53248       conv4_block6_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, None, None, 1 512         conv4_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, None, None, 1 0           conv4_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_concat (Concatenat (None, None, None, 4 0           conv4_block5_concat[0][0]        
                                                                 conv4_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block7_0_bn (BatchNormali (None, None, None, 4 1792        conv4_block6_concat[0][0]        
__________________________________________________________________________________________________
conv4_block7_0_relu (Activation (None, None, None, 4 0           conv4_block7_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block7_1_conv (Conv2D)    (None, None, None, 1 57344       conv4_block7_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block7_1_bn (BatchNormali (None, None, None, 1 512         conv4_block7_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block7_1_relu (Activation (None, None, None, 1 0           conv4_block7_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block7_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block7_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block7_concat (Concatenat (None, None, None, 4 0           conv4_block6_concat[0][0]        
                                                                 conv4_block7_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block8_0_bn (BatchNormali (None, None, None, 4 1920        conv4_block7_concat[0][0]        
__________________________________________________________________________________________________
conv4_block8_0_relu (Activation (None, None, None, 4 0           conv4_block8_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block8_1_conv (Conv2D)    (None, None, None, 1 61440       conv4_block8_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block8_1_bn (BatchNormali (None, None, None, 1 512         conv4_block8_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block8_1_relu (Activation (None, None, None, 1 0           conv4_block8_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block8_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block8_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block8_concat (Concatenat (None, None, None, 5 0           conv4_block7_concat[0][0]        
                                                                 conv4_block8_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block9_0_bn (BatchNormali (None, None, None, 5 2048        conv4_block8_concat[0][0]        
__________________________________________________________________________________________________
conv4_block9_0_relu (Activation (None, None, None, 5 0           conv4_block9_0_bn[0][0]          
__________________________________________________________________________________________________
conv4_block9_1_conv (Conv2D)    (None, None, None, 1 65536       conv4_block9_0_relu[0][0]        
__________________________________________________________________________________________________
conv4_block9_1_bn (BatchNormali (None, None, None, 1 512         conv4_block9_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block9_1_relu (Activation (None, None, None, 1 0           conv4_block9_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block9_2_conv (Conv2D)    (None, None, None, 3 36864       conv4_block9_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block9_concat (Concatenat (None, None, None, 5 0           conv4_block8_concat[0][0]        
                                                                 conv4_block9_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block10_0_bn (BatchNormal (None, None, None, 5 2176        conv4_block9_concat[0][0]        
__________________________________________________________________________________________________
conv4_block10_0_relu (Activatio (None, None, None, 5 0           conv4_block10_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block10_1_conv (Conv2D)   (None, None, None, 1 69632       conv4_block10_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block10_1_bn (BatchNormal (None, None, None, 1 512         conv4_block10_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block10_1_relu (Activatio (None, None, None, 1 0           conv4_block10_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block10_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block10_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block10_concat (Concatena (None, None, None, 5 0           conv4_block9_concat[0][0]        
                                                                 conv4_block10_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block11_0_bn (BatchNormal (None, None, None, 5 2304        conv4_block10_concat[0][0]       
__________________________________________________________________________________________________
conv4_block11_0_relu (Activatio (None, None, None, 5 0           conv4_block11_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block11_1_conv (Conv2D)   (None, None, None, 1 73728       conv4_block11_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block11_1_bn (BatchNormal (None, None, None, 1 512         conv4_block11_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block11_1_relu (Activatio (None, None, None, 1 0           conv4_block11_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block11_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block11_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block11_concat (Concatena (None, None, None, 6 0           conv4_block10_concat[0][0]       
                                                                 conv4_block11_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block12_0_bn (BatchNormal (None, None, None, 6 2432        conv4_block11_concat[0][0]       
__________________________________________________________________________________________________
conv4_block12_0_relu (Activatio (None, None, None, 6 0           conv4_block12_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block12_1_conv (Conv2D)   (None, None, None, 1 77824       conv4_block12_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block12_1_bn (BatchNormal (None, None, None, 1 512         conv4_block12_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block12_1_relu (Activatio (None, None, None, 1 0           conv4_block12_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block12_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block12_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block12_concat (Concatena (None, None, None, 6 0           conv4_block11_concat[0][0]       
                                                                 conv4_block12_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block13_0_bn (BatchNormal (None, None, None, 6 2560        conv4_block12_concat[0][0]       
__________________________________________________________________________________________________
conv4_block13_0_relu (Activatio (None, None, None, 6 0           conv4_block13_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block13_1_conv (Conv2D)   (None, None, None, 1 81920       conv4_block13_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block13_1_bn (BatchNormal (None, None, None, 1 512         conv4_block13_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block13_1_relu (Activatio (None, None, None, 1 0           conv4_block13_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block13_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block13_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block13_concat (Concatena (None, None, None, 6 0           conv4_block12_concat[0][0]       
                                                                 conv4_block13_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block14_0_bn (BatchNormal (None, None, None, 6 2688        conv4_block13_concat[0][0]       
__________________________________________________________________________________________________
conv4_block14_0_relu (Activatio (None, None, None, 6 0           conv4_block14_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block14_1_conv (Conv2D)   (None, None, None, 1 86016       conv4_block14_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block14_1_bn (BatchNormal (None, None, None, 1 512         conv4_block14_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block14_1_relu (Activatio (None, None, None, 1 0           conv4_block14_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block14_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block14_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block14_concat (Concatena (None, None, None, 7 0           conv4_block13_concat[0][0]       
                                                                 conv4_block14_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block15_0_bn (BatchNormal (None, None, None, 7 2816        conv4_block14_concat[0][0]       
__________________________________________________________________________________________________
conv4_block15_0_relu (Activatio (None, None, None, 7 0           conv4_block15_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block15_1_conv (Conv2D)   (None, None, None, 1 90112       conv4_block15_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block15_1_bn (BatchNormal (None, None, None, 1 512         conv4_block15_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block15_1_relu (Activatio (None, None, None, 1 0           conv4_block15_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block15_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block15_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block15_concat (Concatena (None, None, None, 7 0           conv4_block14_concat[0][0]       
                                                                 conv4_block15_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block16_0_bn (BatchNormal (None, None, None, 7 2944        conv4_block15_concat[0][0]       
__________________________________________________________________________________________________
conv4_block16_0_relu (Activatio (None, None, None, 7 0           conv4_block16_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block16_1_conv (Conv2D)   (None, None, None, 1 94208       conv4_block16_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block16_1_bn (BatchNormal (None, None, None, 1 512         conv4_block16_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block16_1_relu (Activatio (None, None, None, 1 0           conv4_block16_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block16_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block16_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block16_concat (Concatena (None, None, None, 7 0           conv4_block15_concat[0][0]       
                                                                 conv4_block16_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block17_0_bn (BatchNormal (None, None, None, 7 3072        conv4_block16_concat[0][0]       
__________________________________________________________________________________________________
conv4_block17_0_relu (Activatio (None, None, None, 7 0           conv4_block17_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block17_1_conv (Conv2D)   (None, None, None, 1 98304       conv4_block17_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block17_1_bn (BatchNormal (None, None, None, 1 512         conv4_block17_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block17_1_relu (Activatio (None, None, None, 1 0           conv4_block17_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block17_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block17_1_relu[0][0]       
__________________________________________________________________________________________________
conv4_block17_concat (Concatena (None, None, None, 8 0           conv4_block16_concat[0][0]       
                                                                 conv4_block17_2_conv[0][0]       
__________________________________________________________________________________________________
conv4_block18_0_bn (BatchNormal (None, None, None, 8 3200        conv4_block17_concat[0][0]       
__________________________________________________________________________________________________
conv4_block18_0_relu (Activatio (None, None, None, 8 0           conv4_block18_0_bn[0][0]         
__________________________________________________________________________________________________
conv4_block18_1_conv (Conv2D)   (None, None, None, 1 102400      conv4_block18_0_relu[0][0]       
__________________________________________________________________________________________________
conv4_block18_1_bn (BatchNormal (None, None, None, 1 512         conv4_block18_1_conv[0][0]       
__________________________________________________________________________________________________
conv4_block18_1_relu (Activatio (None, None, None, 1 0           conv4_block18_1_bn[0][0]         
__________________________________________________________________________________________________
conv4_block18_2_conv (Conv2D)   (None, None, None, 3 36864       conv4_block18_1_relu[0][0]       
... (layer-by-layer rows of the model summary omitted: the output-shape and filter columns were truncated in the export. The summary runs through the remaining `conv4` and `conv5` dense blocks and ends with the final `bn` (BatchNormalization) and `relu` (Activation) layers; the parameter totals below are intact.)
==================================================================================================
Total params: 7,037,504
Trainable params: 6,953,856
Non-trainable params: 83,648
__________________________________________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['conv2_block1_1_conv']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
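The DeepDream step nudges the image in the direction of the gradient of the chosen layer's activations, i.e. gradient ascent. As a minimal toy illustration of the idea (nothing to do with the actual network — just a made-up scalar function for this sketch):

```python
# Toy gradient ascent: maximise f(x) = -(x - 3)^2.
# DeepDream does the same, except "x" is the whole image and "f"
# is the mean activation of the selected layer(s).
def grad_f(x):
    return -2.0 * (x - 3.0)  # df/dx

x = 0.0
step_size = 0.1
for _ in range(100):
    x += step_size * grad_f(x)  # ADD the gradient (ascent), don't subtract

print(round(x, 4))  # → 3.0, the maximum of f
```

Subtracting the gradient instead would be ordinary gradient descent, which is why DeepDream is described as descent "in reverse".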
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
41.249926805496216
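As a quick sanity check on the octave loop above (an aside, assuming a hypothetical base size of 500 px to match the `max_dim` used when downloading), the five sizes the loop processes can be computed directly:

```python
# Octave sizes for n in range(-2, 3): each octave rescales the base
# size by OCTAVE_SCALE**n, truncated to an integer (like tf.cast to int32).
OCTAVE_SCALE = 1.30
base = 500  # hypothetical longest dimension, matching max_dim=500

sizes = [int(base * OCTAVE_SCALE ** n) for n in range(-2, 3)]
print(sizes)  # → [295, 384, 500, 650, 845]
```

So the dream runs twice on downscaled copies, once at the original size, and twice on upscaled copies; the final `tf.image.resize` brings the result back to the base shape, which is where some of the blur comes from.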

The images come out quite blurred, so they no longer resemble the originals. I think it works best for the city image; the Venus flytrap is too blurry, and the art piece is interesting but there is a lot of black.

ResNet50
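One aside worth noting (my own observation, not from the tutorial): `tf.keras.applications.resnet50.preprocess_input` follows the "caffe" convention — it flips RGB to BGR and subtracts the ImageNet channel means, with no scaling — unlike Inception's mapping to [-1, 1]. A minimal sketch of that transform and its inverse, assuming the standard mean values:

```python
import numpy as np

# ImageNet per-channel means in BGR order, as used by Keras' "caffe" mode.
MEANS_BGR = np.array([103.939, 116.779, 123.68])

def caffe_preprocess(img_rgb):
    # RGB -> BGR, then subtract the per-channel means (no scaling).
    return img_rgb[..., ::-1] - MEANS_BGR

def caffe_deprocess(img_bgr):
    # Inverse: add the means back, flip BGR -> RGB, round and clip.
    img = img_bgr + MEANS_BGR
    return np.clip(np.rint(img[..., ::-1]), 0, 255).astype(np.uint8)

pixel = np.array([[[10.0, 20.0, 30.0]]])           # one RGB pixel
restored = caffe_deprocess(caffe_preprocess(pixel))
print(restored)  # → [[[10 20 30]]], i.e. a clean round-trip
```

This is why the preprocessing line inside `run_deep_dream_simple` has to change whenever the convnet changes.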

In [ ]:
# The function needs to be redefined here, as the preprocessing has to change for this convnet too
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.resnet50.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.ResNet50(include_top=False, weights='imagenet')
base_model.summary()
Model: "resnet50"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_23 (InputLayer)           [(None, None, None,  0                                            
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D)       (None, None, None, 3 0           input_23[0][0]                   
__________________________________________________________________________________________________
conv1_conv (Conv2D)             (None, None, None, 6 9472        conv1_pad[0][0]                  
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, None, None, 6 256         conv1_conv[0][0]                 
__________________________________________________________________________________________________
conv1_relu (Activation)         (None, None, None, 6 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D)       (None, None, None, 6 0           conv1_relu[0][0]                 
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D)       (None, None, None, 6 0           pool1_pad[0][0]                  
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D)    (None, None, None, 6 4160        pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, None, None, 6 256         conv2_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, None, None, 6 0           conv2_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D)    (None, None, None, 6 36928       conv2_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, None, None, 6 256         conv2_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, None, None, 6 0           conv2_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D)    (None, None, None, 2 16640       pool1_pool[0][0]                 
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D)    (None, None, None, 2 16640       conv2_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, None, None, 2 1024        conv2_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_3_bn (BatchNormali (None, None, None, 2 1024        conv2_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block1_add (Add)          (None, None, None, 2 0           conv2_block1_0_bn[0][0]          
                                                                 conv2_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block1_out (Activation)   (None, None, None, 2 0           conv2_block1_add[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D)    (None, None, None, 6 16448       conv2_block1_out[0][0]           
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, None, None, 6 256         conv2_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, None, None, 6 0           conv2_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D)    (None, None, None, 6 36928       conv2_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, None, None, 6 256         conv2_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, None, None, 6 0           conv2_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D)    (None, None, None, 2 16640       conv2_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block2_3_bn (BatchNormali (None, None, None, 2 1024        conv2_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block2_add (Add)          (None, None, None, 2 0           conv2_block1_out[0][0]           
                                                                 conv2_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block2_out (Activation)   (None, None, None, 2 0           conv2_block2_add[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D)    (None, None, None, 6 16448       conv2_block2_out[0][0]           
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, None, None, 6 256         conv2_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, None, None, 6 0           conv2_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D)    (None, None, None, 6 36928       conv2_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, None, None, 6 256         conv2_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, None, None, 6 0           conv2_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D)    (None, None, None, 2 16640       conv2_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv2_block3_3_bn (BatchNormali (None, None, None, 2 1024        conv2_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv2_block3_add (Add)          (None, None, None, 2 0           conv2_block2_out[0][0]           
                                                                 conv2_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv2_block3_out (Activation)   (None, None, None, 2 0           conv2_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D)    (None, None, None, 1 32896       conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, None, None, 1 512         conv3_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, None, None, 1 0           conv3_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D)    (None, None, None, 1 147584      conv3_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, None, None, 1 512         conv3_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, None, None, 1 0           conv3_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D)    (None, None, None, 5 131584      conv2_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D)    (None, None, None, 5 66048       conv3_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, None, None, 5 2048        conv3_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_3_bn (BatchNormali (None, None, None, 5 2048        conv3_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block1_add (Add)          (None, None, None, 5 0           conv3_block1_0_bn[0][0]          
                                                                 conv3_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block1_out (Activation)   (None, None, None, 5 0           conv3_block1_add[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D)    (None, None, None, 1 65664       conv3_block1_out[0][0]           
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, None, None, 1 512         conv3_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, None, None, 1 0           conv3_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D)    (None, None, None, 1 147584      conv3_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, None, None, 1 512         conv3_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, None, None, 1 0           conv3_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D)    (None, None, None, 5 66048       conv3_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block2_3_bn (BatchNormali (None, None, None, 5 2048        conv3_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block2_add (Add)          (None, None, None, 5 0           conv3_block1_out[0][0]           
                                                                 conv3_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block2_out (Activation)   (None, None, None, 5 0           conv3_block2_add[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D)    (None, None, None, 1 65664       conv3_block2_out[0][0]           
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, None, None, 1 512         conv3_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, None, None, 1 0           conv3_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D)    (None, None, None, 1 147584      conv3_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, None, None, 1 512         conv3_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, None, None, 1 0           conv3_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D)    (None, None, None, 5 66048       conv3_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block3_3_bn (BatchNormali (None, None, None, 5 2048        conv3_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block3_add (Add)          (None, None, None, 5 0           conv3_block2_out[0][0]           
                                                                 conv3_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block3_out (Activation)   (None, None, None, 5 0           conv3_block3_add[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D)    (None, None, None, 1 65664       conv3_block3_out[0][0]           
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, None, None, 1 512         conv3_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, None, None, 1 0           conv3_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D)    (None, None, None, 1 147584      conv3_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, None, None, 1 512         conv3_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, None, None, 1 0           conv3_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D)    (None, None, None, 5 66048       conv3_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv3_block4_3_bn (BatchNormali (None, None, None, 5 2048        conv3_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv3_block4_add (Add)          (None, None, None, 5 0           conv3_block3_out[0][0]           
                                                                 conv3_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv3_block4_out (Activation)   (None, None, None, 5 0           conv3_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D)    (None, None, None, 2 131328      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, None, None, 2 1024        conv4_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, None, None, 2 0           conv4_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D)    (None, None, None, 2 590080      conv4_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, None, None, 2 1024        conv4_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, None, None, 2 0           conv4_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D)    (None, None, None, 1 525312      conv3_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D)    (None, None, None, 1 263168      conv4_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, None, None, 1 4096        conv4_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_3_bn (BatchNormali (None, None, None, 1 4096        conv4_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block1_add (Add)          (None, None, None, 1 0           conv4_block1_0_bn[0][0]          
                                                                 conv4_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block1_out (Activation)   (None, None, None, 1 0           conv4_block1_add[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D)    (None, None, None, 2 262400      conv4_block1_out[0][0]           
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, None, None, 2 1024        conv4_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, None, None, 2 0           conv4_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D)    (None, None, None, 2 590080      conv4_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, None, None, 2 1024        conv4_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, None, None, 2 0           conv4_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D)    (None, None, None, 1 263168      conv4_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block2_3_bn (BatchNormali (None, None, None, 1 4096        conv4_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block2_add (Add)          (None, None, None, 1 0           conv4_block1_out[0][0]           
                                                                 conv4_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block2_out (Activation)   (None, None, None, 1 0           conv4_block2_add[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D)    (None, None, None, 2 262400      conv4_block2_out[0][0]           
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, None, None, 2 1024        conv4_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, None, None, 2 0           conv4_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D)    (None, None, None, 2 590080      conv4_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, None, None, 2 1024        conv4_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, None, None, 2 0           conv4_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D)    (None, None, None, 1 263168      conv4_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block3_3_bn (BatchNormali (None, None, None, 1 4096        conv4_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block3_add (Add)          (None, None, None, 1 0           conv4_block2_out[0][0]           
                                                                 conv4_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block3_out (Activation)   (None, None, None, 1 0           conv4_block3_add[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D)    (None, None, None, 2 262400      conv4_block3_out[0][0]           
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, None, None, 2 1024        conv4_block4_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, None, None, 2 0           conv4_block4_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D)    (None, None, None, 2 590080      conv4_block4_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, None, None, 2 1024        conv4_block4_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, None, None, 2 0           conv4_block4_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D)    (None, None, None, 1 263168      conv4_block4_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block4_3_bn (BatchNormali (None, None, None, 1 4096        conv4_block4_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block4_add (Add)          (None, None, None, 1 0           conv4_block3_out[0][0]           
                                                                 conv4_block4_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block4_out (Activation)   (None, None, None, 1 0           conv4_block4_add[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D)    (None, None, None, 2 262400      conv4_block4_out[0][0]           
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, None, None, 2 1024        conv4_block5_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, None, None, 2 0           conv4_block5_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D)    (None, None, None, 2 590080      conv4_block5_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, None, None, 2 1024        conv4_block5_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, None, None, 2 0           conv4_block5_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D)    (None, None, None, 1 263168      conv4_block5_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block5_3_bn (BatchNormali (None, None, None, 1 4096        conv4_block5_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block5_add (Add)          (None, None, None, 1 0           conv4_block4_out[0][0]           
                                                                 conv4_block5_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block5_out (Activation)   (None, None, None, 1 0           conv4_block5_add[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D)    (None, None, None, 2 262400      conv4_block5_out[0][0]           
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, None, None, 2 1024        conv4_block6_1_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, None, None, 2 0           conv4_block6_1_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D)    (None, None, None, 2 590080      conv4_block6_1_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, None, None, 2 1024        conv4_block6_2_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, None, None, 2 0           conv4_block6_2_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D)    (None, None, None, 1 263168      conv4_block6_2_relu[0][0]        
__________________________________________________________________________________________________
conv4_block6_3_bn (BatchNormali (None, None, None, 1 4096        conv4_block6_3_conv[0][0]        
__________________________________________________________________________________________________
conv4_block6_add (Add)          (None, None, None, 1 0           conv4_block5_out[0][0]           
                                                                 conv4_block6_3_bn[0][0]          
__________________________________________________________________________________________________
conv4_block6_out (Activation)   (None, None, None, 1 0           conv4_block6_add[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D)    (None, None, None, 5 524800      conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, None, None, 5 2048        conv5_block1_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, None, None, 5 0           conv5_block1_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D)    (None, None, None, 5 2359808     conv5_block1_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_bn (BatchNormali (None, None, None, 5 2048        conv5_block1_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_2_relu (Activation (None, None, None, 5 0           conv5_block1_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_0_conv (Conv2D)    (None, None, None, 2 2099200     conv4_block6_out[0][0]           
__________________________________________________________________________________________________
conv5_block1_3_conv (Conv2D)    (None, None, None, 2 1050624     conv5_block1_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block1_0_bn (BatchNormali (None, None, None, 2 8192        conv5_block1_0_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_3_bn (BatchNormali (None, None, None, 2 8192        conv5_block1_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block1_add (Add)          (None, None, None, 2 0           conv5_block1_0_bn[0][0]          
                                                                 conv5_block1_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block1_out (Activation)   (None, None, None, 2 0           conv5_block1_add[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D)    (None, None, None, 5 1049088     conv5_block1_out[0][0]           
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, None, None, 5 2048        conv5_block2_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, None, None, 5 0           conv5_block2_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D)    (None, None, None, 5 2359808     conv5_block2_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_bn (BatchNormali (None, None, None, 5 2048        conv5_block2_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_2_relu (Activation (None, None, None, 5 0           conv5_block2_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_3_conv (Conv2D)    (None, None, None, 2 1050624     conv5_block2_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block2_3_bn (BatchNormali (None, None, None, 2 8192        conv5_block2_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block2_add (Add)          (None, None, None, 2 0           conv5_block1_out[0][0]           
                                                                 conv5_block2_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block2_out (Activation)   (None, None, None, 2 0           conv5_block2_add[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D)    (None, None, None, 5 1049088     conv5_block2_out[0][0]           
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, None, None, 5 2048        conv5_block3_1_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, None, None, 5 0           conv5_block3_1_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D)    (None, None, None, 5 2359808     conv5_block3_1_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali (None, None, None, 5 2048        conv5_block3_2_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation (None, None, None, 5 0           conv5_block3_2_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D)    (None, None, None, 2 1050624     conv5_block3_2_relu[0][0]        
__________________________________________________________________________________________________
conv5_block3_3_bn (BatchNormali (None, None, None, 2 8192        conv5_block3_3_conv[0][0]        
__________________________________________________________________________________________________
conv5_block3_add (Add)          (None, None, None, 2 0           conv5_block2_out[0][0]           
                                                                 conv5_block3_3_bn[0][0]          
__________________________________________________________________________________________________
conv5_block3_out (Activation)   (None, None, None, 2 0           conv5_block3_add[0][0]           
==================================================================================================
Total params: 23,587,712
Trainable params: 23,534,592
Non-trainable params: 53,120
__________________________________________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['conv2_block1_0_conv']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
40.785926818847656
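For reference, the octave loop above processes the image at scales of OCTAVE_SCALE**n for n from -2 to 2, so each octave is 1.3× larger than the previous one. A quick sketch of the approximate sizes this gives for a 500-pixel dimension, mirroring the truncating integer cast done with tf.cast in the loop:

```python
# Approximate per-octave sizes for a 500-pixel dimension, mirroring the
# float multiply followed by integer truncation (tf.cast to tf.int32) above.
OCTAVE_SCALE = 1.30
sizes = [int(500.0 * OCTAVE_SCALE**n) for n in range(-2, 3)]
print(sizes)  # → [295, 384, 500, 650, 845]
```

Running the dream at the smaller octaves first lets patterns emerge at coarse scales before being refined at the full resolution.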

I really like the last image (the art one). The colours look like blue fire, and the city looks as though it was rendered in very old-school graphics.

Overview

Since this comes down to preference, my overall favourites, and the combinations I will use, are:

Natural image: InceptionV3 (works well with the natural image)

Urban image: Xception (may produce an interesting old-graphics effect)

Art image: MobileNet (could be a cool, novel application)

Layers

The lower layers tend to produce simpler, more geometric outputs (edges and textures), while the upper layers produce more complex, abstract or biological shapes. I will test at least one case with lower layers, one with upper layers, and a combination or midpoint. The combination may be skipped if one of the effects clearly looks better (e.g. the geometric one); I will then explore more around that point.
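To keep these lower/mid/upper experiments consistent, a small helper can map each depth setting to a preset list of InceptionV3 "mixed" layer names. This is a hypothetical helper of my own, and the specific groupings below are assumptions for the experiments, not something fixed by the tutorial:

```python
# Hypothetical helper: map a depth setting to InceptionV3 "mixed" layer names.
# The specific groupings are an assumption chosen for these experiments.
def pick_layers(depth):
    presets = {
        'lower': ['mixed0', 'mixed1'],   # early "mixed" blocks
        'mid':   ['mixed3', 'mixed5'],   # a midpoint between the two extremes
        'upper': ['mixed8', 'mixed9'],   # late "mixed" blocks
    }
    return presets[depth]

print(pick_layers('lower'))  # → ['mixed0', 'mixed1']
```

Usage: set `names = pick_layers('upper')`, then build the dream model from `base_model.get_layer(name).output` as in the cell above.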

Image 1 (Natural image – Venus Fly Trap)

In [ ]:
# The function is redefined here because the preprocessing must match the convnet chosen below
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')
base_model.summary()
Model: "inception_v3"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_25 (InputLayer)           [(None, None, None,  0                                            
__________________________________________________________________________________________________
conv2d_400 (Conv2D)             (None, None, None, 3 864         input_25[0][0]                   
__________________________________________________________________________________________________
batch_normalization_400 (BatchN (None, None, None, 3 96          conv2d_400[0][0]                 
__________________________________________________________________________________________________
activation_636 (Activation)     (None, None, None, 3 0           batch_normalization_400[0][0]    
__________________________________________________________________________________________________
conv2d_401 (Conv2D)             (None, None, None, 3 9216        activation_636[0][0]             
__________________________________________________________________________________________________
batch_normalization_401 (BatchN (None, None, None, 3 96          conv2d_401[0][0]                 
__________________________________________________________________________________________________
activation_637 (Activation)     (None, None, None, 3 0           batch_normalization_401[0][0]    
__________________________________________________________________________________________________
conv2d_402 (Conv2D)             (None, None, None, 6 18432       activation_637[0][0]             
__________________________________________________________________________________________________
batch_normalization_402 (BatchN (None, None, None, 6 192         conv2d_402[0][0]                 
__________________________________________________________________________________________________
activation_638 (Activation)     (None, None, None, 6 0           batch_normalization_402[0][0]    
__________________________________________________________________________________________________
max_pooling2d_16 (MaxPooling2D) (None, None, None, 6 0           activation_638[0][0]             
__________________________________________________________________________________________________
conv2d_403 (Conv2D)             (None, None, None, 8 5120        max_pooling2d_16[0][0]           
__________________________________________________________________________________________________
batch_normalization_403 (BatchN (None, None, None, 8 240         conv2d_403[0][0]                 
__________________________________________________________________________________________________
activation_639 (Activation)     (None, None, None, 8 0           batch_normalization_403[0][0]    
__________________________________________________________________________________________________
conv2d_404 (Conv2D)             (None, None, None, 1 138240      activation_639[0][0]             
__________________________________________________________________________________________________
batch_normalization_404 (BatchN (None, None, None, 1 576         conv2d_404[0][0]                 
__________________________________________________________________________________________________
activation_640 (Activation)     (None, None, None, 1 0           batch_normalization_404[0][0]    
__________________________________________________________________________________________________
max_pooling2d_17 (MaxPooling2D) (None, None, None, 1 0           activation_640[0][0]             
__________________________________________________________________________________________________
conv2d_408 (Conv2D)             (None, None, None, 6 12288       max_pooling2d_17[0][0]           
__________________________________________________________________________________________________
batch_normalization_408 (BatchN (None, None, None, 6 192         conv2d_408[0][0]                 
__________________________________________________________________________________________________
activation_644 (Activation)     (None, None, None, 6 0           batch_normalization_408[0][0]    
__________________________________________________________________________________________________
conv2d_406 (Conv2D)             (None, None, None, 4 9216        max_pooling2d_17[0][0]           
__________________________________________________________________________________________________
conv2d_409 (Conv2D)             (None, None, None, 9 55296       activation_644[0][0]             
__________________________________________________________________________________________________
batch_normalization_406 (BatchN (None, None, None, 4 144         conv2d_406[0][0]                 
__________________________________________________________________________________________________
batch_normalization_409 (BatchN (None, None, None, 9 288         conv2d_409[0][0]                 
__________________________________________________________________________________________________
activation_642 (Activation)     (None, None, None, 4 0           batch_normalization_406[0][0]    
__________________________________________________________________________________________________
activation_645 (Activation)     (None, None, None, 9 0           batch_normalization_409[0][0]    
__________________________________________________________________________________________________
average_pooling2d_36 (AveragePo (None, None, None, 1 0           max_pooling2d_17[0][0]           
__________________________________________________________________________________________________
conv2d_405 (Conv2D)             (None, None, None, 6 12288       max_pooling2d_17[0][0]           
__________________________________________________________________________________________________
conv2d_407 (Conv2D)             (None, None, None, 6 76800       activation_642[0][0]             
__________________________________________________________________________________________________
conv2d_410 (Conv2D)             (None, None, None, 9 82944       activation_645[0][0]             
__________________________________________________________________________________________________
conv2d_411 (Conv2D)             (None, None, None, 3 6144        average_pooling2d_36[0][0]       
__________________________________________________________________________________________________
batch_normalization_405 (BatchN (None, None, None, 6 192         conv2d_405[0][0]                 
__________________________________________________________________________________________________
batch_normalization_407 (BatchN (None, None, None, 6 192         conv2d_407[0][0]                 
__________________________________________________________________________________________________
batch_normalization_410 (BatchN (None, None, None, 9 288         conv2d_410[0][0]                 
__________________________________________________________________________________________________
batch_normalization_411 (BatchN (None, None, None, 3 96          conv2d_411[0][0]                 
__________________________________________________________________________________________________
activation_641 (Activation)     (None, None, None, 6 0           batch_normalization_405[0][0]    
__________________________________________________________________________________________________
activation_643 (Activation)     (None, None, None, 6 0           batch_normalization_407[0][0]    
__________________________________________________________________________________________________
activation_646 (Activation)     (None, None, None, 9 0           batch_normalization_410[0][0]    
__________________________________________________________________________________________________
activation_647 (Activation)     (None, None, None, 3 0           batch_normalization_411[0][0]    
__________________________________________________________________________________________________
mixed0 (Concatenate)            (None, None, None, 2 0           activation_641[0][0]             
                                                                 activation_643[0][0]             
                                                                 activation_646[0][0]             
                                                                 activation_647[0][0]             
__________________________________________________________________________________________________
conv2d_415 (Conv2D)             (None, None, None, 6 16384       mixed0[0][0]                     
__________________________________________________________________________________________________
batch_normalization_415 (BatchN (None, None, None, 6 192         conv2d_415[0][0]                 
__________________________________________________________________________________________________
activation_651 (Activation)     (None, None, None, 6 0           batch_normalization_415[0][0]    
__________________________________________________________________________________________________
conv2d_413 (Conv2D)             (None, None, None, 4 12288       mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_416 (Conv2D)             (None, None, None, 9 55296       activation_651[0][0]             
__________________________________________________________________________________________________
batch_normalization_413 (BatchN (None, None, None, 4 144         conv2d_413[0][0]                 
__________________________________________________________________________________________________
batch_normalization_416 (BatchN (None, None, None, 9 288         conv2d_416[0][0]                 
__________________________________________________________________________________________________
activation_649 (Activation)     (None, None, None, 4 0           batch_normalization_413[0][0]    
__________________________________________________________________________________________________
activation_652 (Activation)     (None, None, None, 9 0           batch_normalization_416[0][0]    
__________________________________________________________________________________________________
average_pooling2d_37 (AveragePo (None, None, None, 2 0           mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_412 (Conv2D)             (None, None, None, 6 16384       mixed0[0][0]                     
__________________________________________________________________________________________________
conv2d_414 (Conv2D)             (None, None, None, 6 76800       activation_649[0][0]             
__________________________________________________________________________________________________
conv2d_417 (Conv2D)             (None, None, None, 9 82944       activation_652[0][0]             
__________________________________________________________________________________________________
conv2d_418 (Conv2D)             (None, None, None, 6 16384       average_pooling2d_37[0][0]       
__________________________________________________________________________________________________
batch_normalization_412 (BatchN (None, None, None, 6 192         conv2d_412[0][0]                 
__________________________________________________________________________________________________
batch_normalization_414 (BatchN (None, None, None, 6 192         conv2d_414[0][0]                 
__________________________________________________________________________________________________
batch_normalization_417 (BatchN (None, None, None, 9 288         conv2d_417[0][0]                 
__________________________________________________________________________________________________
batch_normalization_418 (BatchN (None, None, None, 6 192         conv2d_418[0][0]                 
__________________________________________________________________________________________________
activation_648 (Activation)     (None, None, None, 6 0           batch_normalization_412[0][0]    
__________________________________________________________________________________________________
activation_650 (Activation)     (None, None, None, 6 0           batch_normalization_414[0][0]    
__________________________________________________________________________________________________
activation_653 (Activation)     (None, None, None, 9 0           batch_normalization_417[0][0]    
__________________________________________________________________________________________________
activation_654 (Activation)     (None, None, None, 6 0           batch_normalization_418[0][0]    
__________________________________________________________________________________________________
mixed1 (Concatenate)            (None, None, None, 2 0           activation_648[0][0]             
                                                                 activation_650[0][0]             
                                                                 activation_653[0][0]             
                                                                 activation_654[0][0]             
__________________________________________________________________________________________________
conv2d_422 (Conv2D)             (None, None, None, 6 18432       mixed1[0][0]                     
__________________________________________________________________________________________________
batch_normalization_422 (BatchN (None, None, None, 6 192         conv2d_422[0][0]                 
__________________________________________________________________________________________________
activation_658 (Activation)     (None, None, None, 6 0           batch_normalization_422[0][0]    
__________________________________________________________________________________________________
conv2d_420 (Conv2D)             (None, None, None, 4 13824       mixed1[0][0]                     
__________________________________________________________________________________________________
conv2d_423 (Conv2D)             (None, None, None, 9 55296       activation_658[0][0]             
__________________________________________________________________________________________________
batch_normalization_420 (BatchN (None, None, None, 4 144         conv2d_420[0][0]                 
__________________________________________________________________________________________________
batch_normalization_423 (BatchN (None, None, None, 9 288         conv2d_423[0][0]                 
__________________________________________________________________________________________________
activation_656 (Activation)     (None, None, None, 4 0           batch_normalization_420[0][0]    
__________________________________________________________________________________________________
activation_659 (Activation)     (None, None, None, 9 0           batch_normalization_423[0][0]    
__________________________________________________________________________________________________
average_pooling2d_38 (AveragePo (None, None, None, 2 0           mixed1[0][0]                     
__________________________________________________________________________________________________
conv2d_419 (Conv2D)             (None, None, None, 6 18432       mixed1[0][0]                     
__________________________________________________________________________________________________
conv2d_421 (Conv2D)             (None, None, None, 6 76800       activation_656[0][0]             
__________________________________________________________________________________________________
conv2d_424 (Conv2D)             (None, None, None, 9 82944       activation_659[0][0]             
__________________________________________________________________________________________________
conv2d_425 (Conv2D)             (None, None, None, 6 18432       average_pooling2d_38[0][0]       
__________________________________________________________________________________________________
batch_normalization_419 (BatchN (None, None, None, 6 192         conv2d_419[0][0]                 
__________________________________________________________________________________________________
batch_normalization_421 (BatchN (None, None, None, 6 192         conv2d_421[0][0]                 
__________________________________________________________________________________________________
batch_normalization_424 (BatchN (None, None, None, 9 288         conv2d_424[0][0]                 
__________________________________________________________________________________________________
batch_normalization_425 (BatchN (None, None, None, 6 192         conv2d_425[0][0]                 
__________________________________________________________________________________________________
activation_655 (Activation)     (None, None, None, 6 0           batch_normalization_419[0][0]    
__________________________________________________________________________________________________
activation_657 (Activation)     (None, None, None, 6 0           batch_normalization_421[0][0]    
__________________________________________________________________________________________________
activation_660 (Activation)     (None, None, None, 9 0           batch_normalization_424[0][0]    
__________________________________________________________________________________________________
activation_661 (Activation)     (None, None, None, 6 0           batch_normalization_425[0][0]    
__________________________________________________________________________________________________
mixed2 (Concatenate)            (None, None, None, 2 0           activation_655[0][0]             
                                                                 activation_657[0][0]             
                                                                 activation_660[0][0]             
                                                                 activation_661[0][0]             
__________________________________________________________________________________________________
conv2d_427 (Conv2D)             (None, None, None, 6 18432       mixed2[0][0]                     
__________________________________________________________________________________________________
batch_normalization_427 (BatchN (None, None, None, 6 192         conv2d_427[0][0]                 
__________________________________________________________________________________________________
activation_663 (Activation)     (None, None, None, 6 0           batch_normalization_427[0][0]    
__________________________________________________________________________________________________
conv2d_428 (Conv2D)             (None, None, None, 9 55296       activation_663[0][0]             
__________________________________________________________________________________________________
batch_normalization_428 (BatchN (None, None, None, 9 288         conv2d_428[0][0]                 
__________________________________________________________________________________________________
activation_664 (Activation)     (None, None, None, 9 0           batch_normalization_428[0][0]    
__________________________________________________________________________________________________
conv2d_426 (Conv2D)             (None, None, None, 3 995328      mixed2[0][0]                     
__________________________________________________________________________________________________
... [InceptionV3 `model.summary()` output truncated for brevity: inception blocks mixed3 through mixed8 (Conv2D / BatchNormalization / Activation / pooling layers)] ...
batch_normalization_476 (BatchN (None, None, None, 3 960         conv2d_476[0][0]                 
__________________________________________________________________________________________________
activation_714 (Activation)     (None, None, None, 3 0           batch_normalization_478[0][0]    
__________________________________________________________________________________________________
activation_715 (Activation)     (None, None, None, 3 0           batch_normalization_479[0][0]    
__________________________________________________________________________________________________
activation_718 (Activation)     (None, None, None, 3 0           batch_normalization_482[0][0]    
__________________________________________________________________________________________________
activation_719 (Activation)     (None, None, None, 3 0           batch_normalization_483[0][0]    
__________________________________________________________________________________________________
batch_normalization_484 (BatchN (None, None, None, 1 576         conv2d_484[0][0]                 
__________________________________________________________________________________________________
activation_712 (Activation)     (None, None, None, 3 0           batch_normalization_476[0][0]    
__________________________________________________________________________________________________
mixed9_0 (Concatenate)          (None, None, None, 7 0           activation_714[0][0]             
                                                                 activation_715[0][0]             
__________________________________________________________________________________________________
concatenate_12 (Concatenate)    (None, None, None, 7 0           activation_718[0][0]             
                                                                 activation_719[0][0]             
__________________________________________________________________________________________________
activation_720 (Activation)     (None, None, None, 1 0           batch_normalization_484[0][0]    
__________________________________________________________________________________________________
mixed9 (Concatenate)            (None, None, None, 2 0           activation_712[0][0]             
                                                                 mixed9_0[0][0]                   
                                                                 concatenate_12[0][0]             
                                                                 activation_720[0][0]             
__________________________________________________________________________________________________
conv2d_489 (Conv2D)             (None, None, None, 4 917504      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_489 (BatchN (None, None, None, 4 1344        conv2d_489[0][0]                 
__________________________________________________________________________________________________
activation_725 (Activation)     (None, None, None, 4 0           batch_normalization_489[0][0]    
__________________________________________________________________________________________________
conv2d_486 (Conv2D)             (None, None, None, 3 786432      mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_490 (Conv2D)             (None, None, None, 3 1548288     activation_725[0][0]             
__________________________________________________________________________________________________
batch_normalization_486 (BatchN (None, None, None, 3 1152        conv2d_486[0][0]                 
__________________________________________________________________________________________________
batch_normalization_490 (BatchN (None, None, None, 3 1152        conv2d_490[0][0]                 
__________________________________________________________________________________________________
activation_722 (Activation)     (None, None, None, 3 0           batch_normalization_486[0][0]    
__________________________________________________________________________________________________
activation_726 (Activation)     (None, None, None, 3 0           batch_normalization_490[0][0]    
__________________________________________________________________________________________________
conv2d_487 (Conv2D)             (None, None, None, 3 442368      activation_722[0][0]             
__________________________________________________________________________________________________
conv2d_488 (Conv2D)             (None, None, None, 3 442368      activation_722[0][0]             
__________________________________________________________________________________________________
conv2d_491 (Conv2D)             (None, None, None, 3 442368      activation_726[0][0]             
__________________________________________________________________________________________________
conv2d_492 (Conv2D)             (None, None, None, 3 442368      activation_726[0][0]             
__________________________________________________________________________________________________
average_pooling2d_44 (AveragePo (None, None, None, 2 0           mixed9[0][0]                     
__________________________________________________________________________________________________
conv2d_485 (Conv2D)             (None, None, None, 3 655360      mixed9[0][0]                     
__________________________________________________________________________________________________
batch_normalization_487 (BatchN (None, None, None, 3 1152        conv2d_487[0][0]                 
__________________________________________________________________________________________________
batch_normalization_488 (BatchN (None, None, None, 3 1152        conv2d_488[0][0]                 
__________________________________________________________________________________________________
batch_normalization_491 (BatchN (None, None, None, 3 1152        conv2d_491[0][0]                 
__________________________________________________________________________________________________
batch_normalization_492 (BatchN (None, None, None, 3 1152        conv2d_492[0][0]                 
__________________________________________________________________________________________________
conv2d_493 (Conv2D)             (None, None, None, 1 393216      average_pooling2d_44[0][0]       
__________________________________________________________________________________________________
batch_normalization_485 (BatchN (None, None, None, 3 960         conv2d_485[0][0]                 
__________________________________________________________________________________________________
activation_723 (Activation)     (None, None, None, 3 0           batch_normalization_487[0][0]    
__________________________________________________________________________________________________
activation_724 (Activation)     (None, None, None, 3 0           batch_normalization_488[0][0]    
__________________________________________________________________________________________________
activation_727 (Activation)     (None, None, None, 3 0           batch_normalization_491[0][0]    
__________________________________________________________________________________________________
activation_728 (Activation)     (None, None, None, 3 0           batch_normalization_492[0][0]    
__________________________________________________________________________________________________
batch_normalization_493 (BatchN (None, None, None, 1 576         conv2d_493[0][0]                 
__________________________________________________________________________________________________
activation_721 (Activation)     (None, None, None, 3 0           batch_normalization_485[0][0]    
__________________________________________________________________________________________________
mixed9_1 (Concatenate)          (None, None, None, 7 0           activation_723[0][0]             
                                                                 activation_724[0][0]             
__________________________________________________________________________________________________
concatenate_13 (Concatenate)    (None, None, None, 7 0           activation_727[0][0]             
                                                                 activation_728[0][0]             
__________________________________________________________________________________________________
activation_729 (Activation)     (None, None, None, 1 0           batch_normalization_493[0][0]    
__________________________________________________________________________________________________
mixed10 (Concatenate)           (None, None, None, 2 0           activation_721[0][0]             
                                                                 mixed9_1[0][0]                   
                                                                 concatenate_13[0][0]             
                                                                 activation_729[0][0]             
==================================================================================================
Total params: 21,802,784
Trainable params: 21,768,352
Non-trainable params: 34,432
__________________________________________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['mixed2', 'mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
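The loss being maximised here is simply the sum of each chosen layer's mean activation. A minimal NumPy sketch of that reduction (using hypothetical random arrays standing in for the mixed2/mixed3/mixed5 outputs, not the real model activations):

```python
import numpy as np

# Hypothetical activations standing in for the outputs of the chosen layers.
rng = np.random.default_rng(0)
layer_activations = [
    rng.random((1, 28, 28, 288)),   # stand-in for mixed2
    rng.random((1, 14, 14, 768)),   # stand-in for mixed3
    rng.random((1, 14, 14, 768)),   # stand-in for mixed5
]

# DeepDream loss: sum over layers of the mean activation
# (the TF version uses tf.reduce_mean per layer, summed).
loss = sum(float(np.mean(act)) for act in layer_activations)
```

During dreaming, the image is updated to push this scalar up rather than down.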
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
266.1208989620209
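For reference, the octave loop above runs gradient ascent at five scales, resizing the image by `OCTAVE_SCALE**n` for n from -2 to 2. A quick sketch of the resulting size progression (pure Python, assuming a hypothetical 500x375 base image):

```python
# Sketch of the octave size progression used above (no TensorFlow needed).
# Assumes a hypothetical 500x375 base image.
OCTAVE_SCALE = 1.30
base_shape = (500, 375)

shapes = []
for n in range(-2, 3):
    # Same scaling as the loop above; truncation mirrors tf.cast(..., tf.int32)
    new_shape = tuple(int(d * OCTAVE_SCALE ** n) for d in base_shape)
    shapes.append(new_shape)

print(shapes)
```

Each octave starts from the dreamed result of the previous one, so patterns introduced at small scales get refined at larger ones.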

Using non-Concatenate layers makes the output blurry (I tried using Conv2D layers, as they worked for some of the other images and created very interesting effects).

Including the last layers also makes the results quite plain, while the top layers produce more interesting effects.
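Whichever layers are chosen, the core DeepDream update is plain gradient ascent: img += step_size * gradient, nudging the image towards higher activations. A toy 1-D illustration of the same update converging to a maximum (a hypothetical function, not the notebook's loss):

```python
# Toy gradient ascent: maximize f(x) = -(x - 3)^2, whose gradient is -2(x - 3).
# This mirrors the DeepDream update img += step_size * gradient, just in 1-D.
def gradient(x):
    return -2.0 * (x - 3.0)

x = 0.0
step_size = 0.1
for _ in range(100):
    x += step_size * gradient(x)  # ascend, not descend

# x converges towards the maximizer 3.0
```

With descent (a minus sign) x would instead run away from the maximum, which is why DeepDream flips the usual training update.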

Image 2 (Urban image – City)

In [ ]:
# The function needs to be redefined here because the convnet changes here too
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.xception.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.Xception(include_top=False, weights='imagenet')
base_model.summary()
Model: "xception"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_26 (InputLayer)           [(None, None, None,  0                                            
__________________________________________________________________________________________________
block1_conv1 (Conv2D)           (None, None, None, 3 864         input_26[0][0]                   
__________________________________________________________________________________________________
block1_conv1_bn (BatchNormaliza (None, None, None, 3 128         block1_conv1[0][0]               
__________________________________________________________________________________________________
block1_conv1_act (Activation)   (None, None, None, 3 0           block1_conv1_bn[0][0]            
__________________________________________________________________________________________________
block1_conv2 (Conv2D)           (None, None, None, 6 18432       block1_conv1_act[0][0]           
__________________________________________________________________________________________________
block1_conv2_bn (BatchNormaliza (None, None, None, 6 256         block1_conv2[0][0]               
__________________________________________________________________________________________________
block1_conv2_act (Activation)   (None, None, None, 6 0           block1_conv2_bn[0][0]            
__________________________________________________________________________________________________
block2_sepconv1 (SeparableConv2 (None, None, None, 1 8768        block1_conv2_act[0][0]           
__________________________________________________________________________________________________
block2_sepconv1_bn (BatchNormal (None, None, None, 1 512         block2_sepconv1[0][0]            
__________________________________________________________________________________________________
block2_sepconv2_act (Activation (None, None, None, 1 0           block2_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block2_sepconv2 (SeparableConv2 (None, None, None, 1 17536       block2_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block2_sepconv2_bn (BatchNormal (None, None, None, 1 512         block2_sepconv2[0][0]            
__________________________________________________________________________________________________
conv2d_494 (Conv2D)             (None, None, None, 1 8192        block1_conv2_act[0][0]           
__________________________________________________________________________________________________
block2_pool (MaxPooling2D)      (None, None, None, 1 0           block2_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
batch_normalization_494 (BatchN (None, None, None, 1 512         conv2d_494[0][0]                 
__________________________________________________________________________________________________
add_76 (Add)                    (None, None, None, 1 0           block2_pool[0][0]                
                                                                 batch_normalization_494[0][0]    
__________________________________________________________________________________________________
block3_sepconv1_act (Activation (None, None, None, 1 0           add_76[0][0]                     
__________________________________________________________________________________________________
block3_sepconv1 (SeparableConv2 (None, None, None, 2 33920       block3_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block3_sepconv1_bn (BatchNormal (None, None, None, 2 1024        block3_sepconv1[0][0]            
__________________________________________________________________________________________________
block3_sepconv2_act (Activation (None, None, None, 2 0           block3_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block3_sepconv2 (SeparableConv2 (None, None, None, 2 67840       block3_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block3_sepconv2_bn (BatchNormal (None, None, None, 2 1024        block3_sepconv2[0][0]            
__________________________________________________________________________________________________
conv2d_495 (Conv2D)             (None, None, None, 2 32768       add_76[0][0]                     
__________________________________________________________________________________________________
block3_pool (MaxPooling2D)      (None, None, None, 2 0           block3_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
batch_normalization_495 (BatchN (None, None, None, 2 1024        conv2d_495[0][0]                 
__________________________________________________________________________________________________
add_77 (Add)                    (None, None, None, 2 0           block3_pool[0][0]                
                                                                 batch_normalization_495[0][0]    
__________________________________________________________________________________________________
block4_sepconv1_act (Activation (None, None, None, 2 0           add_77[0][0]                     
__________________________________________________________________________________________________
block4_sepconv1 (SeparableConv2 (None, None, None, 7 188672      block4_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block4_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block4_sepconv1[0][0]            
__________________________________________________________________________________________________
block4_sepconv2_act (Activation (None, None, None, 7 0           block4_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block4_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block4_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block4_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block4_sepconv2[0][0]            
__________________________________________________________________________________________________
conv2d_496 (Conv2D)             (None, None, None, 7 186368      add_77[0][0]                     
__________________________________________________________________________________________________
block4_pool (MaxPooling2D)      (None, None, None, 7 0           block4_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
batch_normalization_496 (BatchN (None, None, None, 7 2912        conv2d_496[0][0]                 
__________________________________________________________________________________________________
add_78 (Add)                    (None, None, None, 7 0           block4_pool[0][0]                
                                                                 batch_normalization_496[0][0]    
__________________________________________________________________________________________________
block5_sepconv1_act (Activation (None, None, None, 7 0           add_78[0][0]                     
__________________________________________________________________________________________________
block5_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block5_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block5_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block5_sepconv1[0][0]            
__________________________________________________________________________________________________
block5_sepconv2_act (Activation (None, None, None, 7 0           block5_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block5_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block5_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block5_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block5_sepconv2[0][0]            
__________________________________________________________________________________________________
block5_sepconv3_act (Activation (None, None, None, 7 0           block5_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block5_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block5_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block5_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block5_sepconv3[0][0]            
__________________________________________________________________________________________________
add_79 (Add)                    (None, None, None, 7 0           block5_sepconv3_bn[0][0]         
                                                                 add_78[0][0]                     
__________________________________________________________________________________________________
block6_sepconv1_act (Activation (None, None, None, 7 0           add_79[0][0]                     
__________________________________________________________________________________________________
block6_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block6_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block6_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block6_sepconv1[0][0]            
__________________________________________________________________________________________________
block6_sepconv2_act (Activation (None, None, None, 7 0           block6_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block6_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block6_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block6_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block6_sepconv2[0][0]            
__________________________________________________________________________________________________
block6_sepconv3_act (Activation (None, None, None, 7 0           block6_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block6_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block6_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block6_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block6_sepconv3[0][0]            
__________________________________________________________________________________________________
add_80 (Add)                    (None, None, None, 7 0           block6_sepconv3_bn[0][0]         
                                                                 add_79[0][0]                     
__________________________________________________________________________________________________
block7_sepconv1_act (Activation (None, None, None, 7 0           add_80[0][0]                     
__________________________________________________________________________________________________
block7_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block7_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block7_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block7_sepconv1[0][0]            
__________________________________________________________________________________________________
block7_sepconv2_act (Activation (None, None, None, 7 0           block7_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block7_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block7_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block7_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block7_sepconv2[0][0]            
__________________________________________________________________________________________________
block7_sepconv3_act (Activation (None, None, None, 7 0           block7_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block7_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block7_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block7_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block7_sepconv3[0][0]            
__________________________________________________________________________________________________
add_81 (Add)                    (None, None, None, 7 0           block7_sepconv3_bn[0][0]         
                                                                 add_80[0][0]                     
__________________________________________________________________________________________________
block8_sepconv1_act (Activation (None, None, None, 7 0           add_81[0][0]                     
__________________________________________________________________________________________________
block8_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block8_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block8_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block8_sepconv1[0][0]            
__________________________________________________________________________________________________
block8_sepconv2_act (Activation (None, None, None, 7 0           block8_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block8_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block8_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block8_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block8_sepconv2[0][0]            
__________________________________________________________________________________________________
block8_sepconv3_act (Activation (None, None, None, 7 0           block8_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block8_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block8_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block8_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block8_sepconv3[0][0]            
__________________________________________________________________________________________________
add_82 (Add)                    (None, None, None, 7 0           block8_sepconv3_bn[0][0]         
                                                                 add_81[0][0]                     
__________________________________________________________________________________________________
block9_sepconv1_act (Activation (None, None, None, 7 0           add_82[0][0]                     
__________________________________________________________________________________________________
block9_sepconv1 (SeparableConv2 (None, None, None, 7 536536      block9_sepconv1_act[0][0]        
__________________________________________________________________________________________________
block9_sepconv1_bn (BatchNormal (None, None, None, 7 2912        block9_sepconv1[0][0]            
__________________________________________________________________________________________________
block9_sepconv2_act (Activation (None, None, None, 7 0           block9_sepconv1_bn[0][0]         
__________________________________________________________________________________________________
block9_sepconv2 (SeparableConv2 (None, None, None, 7 536536      block9_sepconv2_act[0][0]        
__________________________________________________________________________________________________
block9_sepconv2_bn (BatchNormal (None, None, None, 7 2912        block9_sepconv2[0][0]            
__________________________________________________________________________________________________
block9_sepconv3_act (Activation (None, None, None, 7 0           block9_sepconv2_bn[0][0]         
__________________________________________________________________________________________________
block9_sepconv3 (SeparableConv2 (None, None, None, 7 536536      block9_sepconv3_act[0][0]        
__________________________________________________________________________________________________
block9_sepconv3_bn (BatchNormal (None, None, None, 7 2912        block9_sepconv3[0][0]            
__________________________________________________________________________________________________
add_83 (Add)                    (None, None, None, 7 0           block9_sepconv3_bn[0][0]         
                                                                 add_82[0][0]                     
__________________________________________________________________________________________________
block10_sepconv1_act (Activatio (None, None, None, 7 0           add_83[0][0]                     
__________________________________________________________________________________________________
block10_sepconv1 (SeparableConv (None, None, None, 7 536536      block10_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block10_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block10_sepconv1[0][0]           
__________________________________________________________________________________________________
block10_sepconv2_act (Activatio (None, None, None, 7 0           block10_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block10_sepconv2 (SeparableConv (None, None, None, 7 536536      block10_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block10_sepconv2_bn (BatchNorma (None, None, None, 7 2912        block10_sepconv2[0][0]           
__________________________________________________________________________________________________
block10_sepconv3_act (Activatio (None, None, None, 7 0           block10_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
block10_sepconv3 (SeparableConv (None, None, None, 7 536536      block10_sepconv3_act[0][0]       
__________________________________________________________________________________________________
block10_sepconv3_bn (BatchNorma (None, None, None, 7 2912        block10_sepconv3[0][0]           
__________________________________________________________________________________________________
add_84 (Add)                    (None, None, None, 7 0           block10_sepconv3_bn[0][0]        
                                                                 add_83[0][0]                     
__________________________________________________________________________________________________
block11_sepconv1_act (Activatio (None, None, None, 7 0           add_84[0][0]                     
__________________________________________________________________________________________________
block11_sepconv1 (SeparableConv (None, None, None, 7 536536      block11_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block11_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block11_sepconv1[0][0]           
__________________________________________________________________________________________________
block11_sepconv2_act (Activatio (None, None, None, 7 0           block11_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block11_sepconv2 (SeparableConv (None, None, None, 7 536536      block11_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block11_sepconv2_bn (BatchNorma (None, None, None, 7 2912        block11_sepconv2[0][0]           
__________________________________________________________________________________________________
block11_sepconv3_act (Activatio (None, None, None, 7 0           block11_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
block11_sepconv3 (SeparableConv (None, None, None, 7 536536      block11_sepconv3_act[0][0]       
__________________________________________________________________________________________________
block11_sepconv3_bn (BatchNorma (None, None, None, 7 2912        block11_sepconv3[0][0]           
__________________________________________________________________________________________________
add_85 (Add)                    (None, None, None, 7 0           block11_sepconv3_bn[0][0]        
                                                                 add_84[0][0]                     
__________________________________________________________________________________________________
block12_sepconv1_act (Activatio (None, None, None, 7 0           add_85[0][0]                     
__________________________________________________________________________________________________
block12_sepconv1 (SeparableConv (None, None, None, 7 536536      block12_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block12_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block12_sepconv1[0][0]           
__________________________________________________________________________________________________
block12_sepconv2_act (Activatio (None, None, None, 7 0           block12_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block12_sepconv2 (SeparableConv (None, None, None, 7 536536      block12_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block12_sepconv2_bn (BatchNorma (None, None, None, 7 2912        block12_sepconv2[0][0]           
__________________________________________________________________________________________________
block12_sepconv3_act (Activatio (None, None, None, 7 0           block12_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
block12_sepconv3 (SeparableConv (None, None, None, 7 536536      block12_sepconv3_act[0][0]       
__________________________________________________________________________________________________
block12_sepconv3_bn (BatchNorma (None, None, None, 7 2912        block12_sepconv3[0][0]           
__________________________________________________________________________________________________
add_86 (Add)                    (None, None, None, 7 0           block12_sepconv3_bn[0][0]        
                                                                 add_85[0][0]                     
__________________________________________________________________________________________________
block13_sepconv1_act (Activatio (None, None, None, 7 0           add_86[0][0]                     
__________________________________________________________________________________________________
block13_sepconv1 (SeparableConv (None, None, None, 7 536536      block13_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block13_sepconv1_bn (BatchNorma (None, None, None, 7 2912        block13_sepconv1[0][0]           
__________________________________________________________________________________________________
block13_sepconv2_act (Activatio (None, None, None, 7 0           block13_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block13_sepconv2 (SeparableConv (None, None, None, 1 752024      block13_sepconv2_act[0][0]       
__________________________________________________________________________________________________
block13_sepconv2_bn (BatchNorma (None, None, None, 1 4096        block13_sepconv2[0][0]           
__________________________________________________________________________________________________
conv2d_497 (Conv2D)             (None, None, None, 1 745472      add_86[0][0]                     
__________________________________________________________________________________________________
block13_pool (MaxPooling2D)     (None, None, None, 1 0           block13_sepconv2_bn[0][0]        
__________________________________________________________________________________________________
batch_normalization_497 (BatchN (None, None, None, 1 4096        conv2d_497[0][0]                 
__________________________________________________________________________________________________
add_87 (Add)                    (None, None, None, 1 0           block13_pool[0][0]               
                                                                 batch_normalization_497[0][0]    
__________________________________________________________________________________________________
block14_sepconv1 (SeparableConv (None, None, None, 1 1582080     add_87[0][0]                     
__________________________________________________________________________________________________
block14_sepconv1_bn (BatchNorma (None, None, None, 1 6144        block14_sepconv1[0][0]           
__________________________________________________________________________________________________
block14_sepconv1_act (Activatio (None, None, None, 1 0           block14_sepconv1_bn[0][0]        
__________________________________________________________________________________________________
block14_sepconv2 (SeparableConv (None, None, None, 2 3159552     block14_sepconv1_act[0][0]       
__________________________________________________________________________________________________
block14_sepconv2_bn (BatchNorma (None, None, None, 2 8192        block14_sepconv2[0][0]           
__________________________________________________________________________________________________
block14_sepconv2_act (Activatio (None, None, None, 2 0           block14_sepconv2_bn[0][0]        
==================================================================================================
Total params: 20,861,480
Trainable params: 20,806,952
Non-trainable params: 54,528
__________________________________________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['block1_conv1', 'block1_conv2']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
12.04287314414978

The upper layers give the image a black-and-white, slightly retro look, while the lower layers make it look drawn. For this example, the upper layers make the image look almost like footage from a faulty old camera or a slightly corrupted video.

Image 3 (Artistic image – The Fall Of Phaeton)

In [ ]:
# The function needs to be redefined here as the preprocessing has to match the convnet used
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')
base_model.summary()
WARNING:tensorflow:`input_shape` is undefined or non-square, or `rows` is not in [128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default.
Model: "mobilenet_1.00_224"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_27 (InputLayer)        [(None, None, None, 3)]   0         
_________________________________________________________________
conv1 (Conv2D)               (None, None, None, 32)    864       
_________________________________________________________________
conv1_bn (BatchNormalization (None, None, None, 32)    128       
_________________________________________________________________
conv1_relu (ReLU)            (None, None, None, 32)    0         
_________________________________________________________________
conv_dw_1 (DepthwiseConv2D)  (None, None, None, 32)    288       
_________________________________________________________________
conv_dw_1_bn (BatchNormaliza (None, None, None, 32)    128       
_________________________________________________________________
conv_dw_1_relu (ReLU)        (None, None, None, 32)    0         
_________________________________________________________________
conv_pw_1 (Conv2D)           (None, None, None, 64)    2048      
_________________________________________________________________
conv_pw_1_bn (BatchNormaliza (None, None, None, 64)    256       
_________________________________________________________________
conv_pw_1_relu (ReLU)        (None, None, None, 64)    0         
_________________________________________________________________
conv_pad_2 (ZeroPadding2D)   (None, None, None, 64)    0         
_________________________________________________________________
conv_dw_2 (DepthwiseConv2D)  (None, None, None, 64)    576       
_________________________________________________________________
conv_dw_2_bn (BatchNormaliza (None, None, None, 64)    256       
_________________________________________________________________
conv_dw_2_relu (ReLU)        (None, None, None, 64)    0         
_________________________________________________________________
conv_pw_2 (Conv2D)           (None, None, None, 128)   8192      
_________________________________________________________________
conv_pw_2_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_pw_2_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_dw_3 (DepthwiseConv2D)  (None, None, None, 128)   1152      
_________________________________________________________________
conv_dw_3_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_dw_3_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_pw_3 (Conv2D)           (None, None, None, 128)   16384     
_________________________________________________________________
conv_pw_3_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_pw_3_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_pad_4 (ZeroPadding2D)   (None, None, None, 128)   0         
_________________________________________________________________
conv_dw_4 (DepthwiseConv2D)  (None, None, None, 128)   1152      
_________________________________________________________________
conv_dw_4_bn (BatchNormaliza (None, None, None, 128)   512       
_________________________________________________________________
conv_dw_4_relu (ReLU)        (None, None, None, 128)   0         
_________________________________________________________________
conv_pw_4 (Conv2D)           (None, None, None, 256)   32768     
_________________________________________________________________
conv_pw_4_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_pw_4_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_dw_5 (DepthwiseConv2D)  (None, None, None, 256)   2304      
_________________________________________________________________
conv_dw_5_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_dw_5_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_pw_5 (Conv2D)           (None, None, None, 256)   65536     
_________________________________________________________________
conv_pw_5_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_pw_5_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_pad_6 (ZeroPadding2D)   (None, None, None, 256)   0         
_________________________________________________________________
conv_dw_6 (DepthwiseConv2D)  (None, None, None, 256)   2304      
_________________________________________________________________
conv_dw_6_bn (BatchNormaliza (None, None, None, 256)   1024      
_________________________________________________________________
conv_dw_6_relu (ReLU)        (None, None, None, 256)   0         
_________________________________________________________________
conv_pw_6 (Conv2D)           (None, None, None, 512)   131072    
_________________________________________________________________
conv_pw_6_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_6_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_7 (DepthwiseConv2D)  (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_7_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_7_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_7 (Conv2D)           (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_7_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_7_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_8 (DepthwiseConv2D)  (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_8_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_8_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_8 (Conv2D)           (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_8_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_8_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_9 (DepthwiseConv2D)  (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_9_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_9_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_9 (Conv2D)           (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_9_bn (BatchNormaliza (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_9_relu (ReLU)        (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_10 (DepthwiseConv2D) (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_10_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_10_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_10 (Conv2D)          (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_10_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_10_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_11 (DepthwiseConv2D) (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_11_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_11_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_11 (Conv2D)          (None, None, None, 512)   262144    
_________________________________________________________________
conv_pw_11_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_pw_11_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pad_12 (ZeroPadding2D)  (None, None, None, 512)   0         
_________________________________________________________________
conv_dw_12 (DepthwiseConv2D) (None, None, None, 512)   4608      
_________________________________________________________________
conv_dw_12_bn (BatchNormaliz (None, None, None, 512)   2048      
_________________________________________________________________
conv_dw_12_relu (ReLU)       (None, None, None, 512)   0         
_________________________________________________________________
conv_pw_12 (Conv2D)          (None, None, None, 1024)  524288    
_________________________________________________________________
conv_pw_12_bn (BatchNormaliz (None, None, None, 1024)  4096      
_________________________________________________________________
conv_pw_12_relu (ReLU)       (None, None, None, 1024)  0         
_________________________________________________________________
conv_dw_13 (DepthwiseConv2D) (None, None, None, 1024)  9216      
_________________________________________________________________
conv_dw_13_bn (BatchNormaliz (None, None, None, 1024)  4096      
_________________________________________________________________
conv_dw_13_relu (ReLU)       (None, None, None, 1024)  0         
_________________________________________________________________
conv_pw_13 (Conv2D)          (None, None, None, 1024)  1048576   
_________________________________________________________________
conv_pw_13_bn (BatchNormaliz (None, None, None, 1024)  4096      
_________________________________________________________________
conv_pw_13_relu (ReLU)       (None, None, None, 1024)  0         
=================================================================
Total params: 3,228,864
Trainable params: 3,206,976
Non-trainable params: 21,888
_________________________________________________________________
In [ ]:
# Maximize the activations of these layers
names = ['conv_pw_9', 'conv_pw_10', 'conv_pw_11']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)
In [ ]:
start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
124.72214221954346

The lower layers make the image resemble cubism, while the upper layers exaggerate the pink/red colours. Using the middle layers, many swirls appear and make the result almost unrecognisable compared to the input image.

Overview

My favourites, and the ones I will use, are:

Natural image: mixed3, mixed5 (the first combination created when experimenting with convnet types)

Urban image: block1_conv1, block1_conv2

Art image: conv_pw_11, conv_pw_12, conv_pw_13

Octave scale

Image 1 (Natural image – Venus Fly Trap)

We no longer need base_model.summary() as the layers have been chosen. We can therefore condense the code further, putting everything in one block so it is easier to run multiple times.
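Since the same run_deep_dream_simple function is repeated in each block, it is worth noting what its while loop does: it splits the requested number of gradient-ascent steps into chunks of at most 100, so that an intermediate image can be displayed between chunks. A minimal, TensorFlow-free sketch of that chunking logic (chunk_steps is a hypothetical helper name, for illustration only):

```python
# Sketch of the step-chunking loop inside run_deep_dream_simple:
# the total step count is consumed in runs of at most 100 so that
# intermediate results can be shown between runs.
def chunk_steps(total_steps, max_chunk=100):
    chunks = []
    remaining = total_steps
    while remaining:
        run = min(remaining, max_chunk)
        remaining -= run
        chunks.append(run)
    return chunks

print(chunk_steps(250))  # -> [100, 100, 50]
```

With steps=50, as used in most cells here, the loop runs only once.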

In [ ]:
# The function needs to be redefined here as the preprocessing has to match the convnet used
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 2.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
909.1170997619629

A very low scale makes the image blurry, while a very large scale changes it too much from the original (you can almost see a chameleon form). A scale of around 1.0 works best, as it keeps the image clear and interesting.
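The effect of OCTAVE_SCALE can be seen from the image sizes the octave loop passes through: each n in range(-2, 3) rescales the base shape by OCTAVE_SCALE**n, cast to int. A small pure-Python sketch (the 500x332 base shape is an invented example, not the actual dimensions of the images used here):

```python
# Illustrates new_shape = float_base_shape * (OCTAVE_SCALE ** n), cast to int,
# for the octaves n = -2..2 used in the loops above.
def octave_shapes(base, scale, octaves=range(-2, 3)):
    return [tuple(int(d * scale**n) for d in base) for n in octaves]

base = (500, 332)  # example (height, width)
for scale in (1.0, 1.3, 2.0):
    print(scale, octave_shapes(base, scale))
```

At scale 1.0 every octave is processed at full size, which is why the result stays sharp; at scale 2.0 the first octave shrinks the image to a quarter of its size in each dimension, and the detail lost there contributes to the blur.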

Image 2 (Urban image – City)

In [ ]:
# The function needs to be redefined here as the preprocessing has to match the convnet used
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.xception.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.Xception(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['block1_conv1', 'block1_conv2']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.50

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
17.433922052383423

Again, a scale of 1.0 makes the image clearer, almost like a camera effect. Higher scales make it blurrier and almost like the render of an older video game.

Image 3 (Artistic image – The Fall Of Phaeton)

In [ ]:
# The function needs to be redefined here as the preprocessing has to match the convnet used
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['conv_pw_11', 'conv_pw_12', 'conv_pw_13']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.50

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.01)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
212.11888074874878

A scale of 1.0 makes the image too clear and takes away from the cubism effect. A scale of 1.50 does not look as nice as 1.30 (at 1.50 the original can be made out a little too much).
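One way to see why a scale of 1.0 behaves differently: with OCTAVE_SCALE = 1.0, every pass of the `for n in range(-2, 3)` loop processes the image at exactly the same size, so the multi-scale ("octave") effect disappears. A small sketch of the size ladder each scale produces (the (500, 375) base shape is just an illustrative example, not taken from any of the images above):

```python
def octave_shapes(base_shape, octave_scale, octaves=range(-2, 3)):
    """Return the (height, width) the loop resizes to at each octave.

    int() truncates, matching the tf.cast(..., tf.int32) in the notebook.
    """
    return [tuple(int(d * octave_scale ** n) for d in base_shape) for n in octaves]

print(octave_shapes((500, 375), 1.0))  # every octave identical: no multi-scale detail
print(octave_shapes((500, 375), 1.3))  # moderate ladder of sizes
print(octave_shapes((500, 375), 1.5))  # wide ladder, from very small to very large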

Overview

My favourites and the ones used will be:

Natural image: 1.0

Urban image: 1.0

Art image: 1.30

Step size

Image 1 (Natural image – Venus Fly Trap)

In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 2.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.03)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
919.5006318092346

Even though I accidentally used the wrong octave scale, the image took too long to process and was too messy at 0.03. A value of 0.008 produces much the same image as 0.01 and takes the same time to run (0.008 may even be a little better).
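The effect of the step size can be illustrated with a toy one-dimensional gradient ascent (a sketch only; the function f(x) = -(x - 3)^2 and the step values are purely illustrative, not the notebook's actual DeepDream update):

```python
# Toy version of the gradient-ascent update img += step_size * gradient.
# We maximise f(x) = -(x - 3)^2, whose gradient is -2 * (x - 3).
def ascend(x, step_size, steps):
    for _ in range(steps):
        x = x + step_size * (-2.0 * (x - 3.0))  # move uphill along the gradient
    return x

print(ascend(0.0, 0.01, 100))  # small step: smooth but slow approach to the maximum at 3
print(ascend(0.0, 0.30, 100))  # larger step: converges quickly
print(ascend(0.0, 1.10, 100))  # too large: every update overshoots and the value diverges
```

This mirrors what was observed above: larger step sizes change the image faster but become messy, while very small ones barely move it in the same number of steps.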

Image 2 (Urban image – City)

In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.xception.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.Xception(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['block1_conv1', 'block1_conv2']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.009)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
11.460044622421265

A step size of 0.02 becomes too grainy, while 0.008 is quite good and a little clearer. I think 0.009 is the best midpoint to use.

Image 3 (Artistic image – The Fall Of Phaeton)

In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['conv_pw_11', 'conv_pw_12', 'conv_pw_13']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=50, step_size=0.02)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
137.74821877479553

Using 0.02 makes the image the brightest, and it now even looks wrinkled. It resembles cubism but with a lot of detail, so I think this is one of the best images so far.

Overview

My favourites and the ones used will be:

Natural image: 0.008

Urban image: 0.009

Art image: 0.02

Steps

Image 1 (Natural image – Venus Fly Trap)

In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=100, step_size=0.008)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
402.3557391166687

Going up to 100 steps, the image becomes more 'in depth' as more features emerge. 100 looks the best, with 60 close behind.
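Because the tutorial code normalises the gradients before each update, each step moves a typical pixel by roughly step_size, so steps × step_size gives a rough budget for how far each octave can drift from the original (an approximation, not an exact bound). Comparing the step counts tried here at the chosen natural-image step size:

```python
# Rough per-octave change budget (model input range is [-1, 1]).
step_size = 0.008
for steps in (20, 60, 100):
    print(f"{steps:3d} steps -> approx. total change {steps * step_size:.2f}")
```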

Image 2 (Urban image – City)

In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.xception.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.Xception(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['block1_conv1', 'block1_conv2']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=30, step_size=0.009)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
17.697973489761353

The higher the number of steps, the more pixelated the image becomes. I tried fewer steps (even fewer than the 20 used; 20 came from when the blue was too strong at 50 steps). I think 20 steps is best, however, as it makes the result look like a grainy photo.

Image 3 (Artistic image – The Fall Of Phaeton)

In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['conv_pw_11', 'conv_pw_12', 'conv_pw_13']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=100, step_size=0.02)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
376.9001383781433

Even though 100 steps makes the image intense and cool, the original becomes almost unrecognisable. Using 20 steps actually makes it smoother, looks better overall, and stays a little closer to cubism/the original.

Overview

My favourites and the ones used will be:

Natural image: 100

Urban image: 20

Art image: 20
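Since every cell above redefines run_deep_dream_simple only to swap the convnet-specific preprocessing, the repetition could be factored out by passing the preprocessing function in as a parameter. A minimal sketch of the pattern with toy stand-ins (the real notebook would pass e.g. tf.keras.applications.inception_v3.preprocess_input and the DeepDream gradient step; the lambdas below are made up for illustration):

```python
def make_runner(preprocess_fn, dream_step):
    """Bind one convnet's preprocessing to a reusable run function."""
    def run(img, steps=100, step_size=0.01):
        img = preprocess_fn(img)              # convnet-specific input scaling
        for _ in range(steps):
            img = dream_step(img, step_size)  # stand-in for the deepdream() call
        return img
    return run

# Toy stand-ins: scale a uint8-style value to [-1, 1], then nudge it each step.
run = make_runner(lambda x: x / 127.5 - 1.0, lambda x, s: x + s)
print(run(255.0, steps=3, step_size=0.5))  # 1.0 after preprocessing, +0.5 three times -> 2.5
```

With this, each image would only need a different `preprocess_fn`, base model, and layer names, rather than a full copy of the function.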

Black and White

Just as an extra check, I used https://online-photo-converter.com/black-and-white-image to convert the three photos into black and white. I will now run the finalised models on them and see the results (out of curiosity).

In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/venusFlyBW.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=100, step_size=0.008)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
402.46393060684204
In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.xception.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/cityBW.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.Xception(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['block1_conv1', 'block1_conv2']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.009)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
9.521480321884155
In [ ]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaetonBW.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['conv_pw_11', 'conv_pw_12', 'conv_pw_13']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.02)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[ ]:
63.48162531852722

Overall, the natural and artistic models are quite interesting, while the urban model performs similarly to the coloured image. It is also worth pointing out that in the natural algorithm, green and red (along with other colours) still appear even though no colour was present at first.
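The reappearing colours have a simple explanation: the black-and-white JPEGs still have three (equal) RGB channels, and nothing in the gradient-ascent update forces the per-channel gradients to stay equal, so the channels drift apart. A tiny sketch (the gradient values are made up for illustration):

```python
import numpy as np

gray = np.full((2, 2, 3), 128.0)    # grayscale pixels: R == G == B everywhere
grad = np.array([0.9, -0.2, 0.1])   # hypothetical per-channel gradient
dreamed = gray + 10.0 * grad        # one large ascent step (broadcast over channels)
print(np.array_equal(dreamed[..., 0], dreamed[..., 1]))  # False: colour has appeared
```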

Final Pieces

I am very happy with the outcomes I have produced. The exploration really allowed me to understand the code, and I was therefore able to manipulate it to my liking. I think my favourite is algorithm 3, but they all bring something different. Natural makes images very interesting and dream-like. Urban makes them look like an old photograph, with an effect where the smaller the image, the more purple it becomes. Artistic brings images quite close to the cubism style (or even a wrinkled-paper effect), which was its intention.

They will be run one more time, with the results displayed here and in the PDF containing all of the produced images (there are too many for Colab; including them all would make the notebook too messy to read).

I will also run some other images (also from https://pixabay.com/) through the models to see how they perform and to get an idea of whether they would be usable for other images.

Whatever the outcome, I am happy with the results for the three images and with my gained understanding.

In [172]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/venusFly.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=100, step_size=0.008)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[172]:
402.17784214019775
In [173]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.xception.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/city.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.Xception(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['block1_conv1', 'block1_conv2']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.009)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[173]:
9.684582948684692
In [174]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/theFallOfPhaeton.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['conv_pw_11', 'conv_pw_12', 'conv_pw_13']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.02)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[174]:
62.32569599151611

Extra

Running some images the networks were not trained with. Most will be in the pdf of all images.

In [178]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.mobilenet.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/deer.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.MobileNet(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['conv_pw_11', 'conv_pw_12', 'conv_pw_13']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.30

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=20, step_size=0.02)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[178]:
57.86771893501282
In [179]:
# The function needs to be redefined here because each convnet requires its own preprocessing
def run_deep_dream_simple(img, steps=100, step_size=0.01):
  # Convert from uint8 to the range expected by the model.
  img = tf.keras.applications.inception_v3.preprocess_input(img)
  img = tf.convert_to_tensor(img)
  step_size = tf.convert_to_tensor(step_size)
  steps_remaining = steps
  step = 0
  while steps_remaining:
    if steps_remaining>100:
      run_steps = tf.constant(100)
    else:
      run_steps = tf.constant(steps_remaining)
    steps_remaining -= run_steps
    step += run_steps

    loss, img = deepdream(img, run_steps, tf.constant(step_size))

    display.clear_output(wait=True)
    show(deprocess(img))
    print ("Step {}, loss {}".format(step, loss))

  result = deprocess(img)
  display.clear_output(wait=True)
  show(result)

  return result

url = 'https://github.com/SimasCes/DeepDream/blob/main/taxi.jpg?raw=true'
# Downsizing the image makes it easier to work with.
original_img = download(url, max_dim=500)

# Define the model and then see a summary table
base_model = tf.keras.applications.InceptionV3(include_top=False, weights='imagenet')

# Maximize the activations of these layers
names = ['mixed3', 'mixed5']
layers = [base_model.get_layer(name).output for name in names]

# Create the feature extraction model
dream_model = tf.keras.Model(inputs=base_model.input, outputs=layers)

deepdream = DeepDream(dream_model)

start = time.time()

OCTAVE_SCALE = 1.0

img = tf.constant(np.array(original_img))
base_shape = tf.shape(img)[:-1]
float_base_shape = tf.cast(base_shape, tf.float32)

for n in range(-2, 3):
  new_shape = tf.cast(float_base_shape*(OCTAVE_SCALE**n), tf.int32)

  img = tf.image.resize(img, new_shape).numpy()

  img = run_deep_dream_simple(img=img, steps=100, step_size=0.008)

display.clear_output(wait=True)
img = tf.image.resize(img, base_shape)
img = tf.image.convert_image_dtype(img/255.0, dtype=tf.uint8)
show(img)

end = time.time()
end-start
Out[179]:
360.62330865859985

I think the styles have worked, and there are interesting results when trying a natural image with the artistic model, or an urban image with the natural one...